lablup / backend.ai-client-py
Backend.AI Client Library for Python
License: MIT License
The docs show
from ai.backend.client import Kernel
but in practice you must write
from ai.backend.client.kernel import Kernel
for the import to work.
Also, even after making that change, attempting the request below
from ai.backend.client.kernel import Kernel
kern = Kernel.get_or_create('python', client_token='abcdef')
result = kern.execute('print("hello world")', mode='query')
print(result['console'])
kern.destroy()
fails with:
AssertionError: You must use API wrapper functions via a Session object.
Since get_or_create is a class method, I am wondering when and how the session should be supplied.
Finally, a minor point: the actual sample code uses 'abc' as the client token, but running it as-is will hit the assertion
assert 4 <= len(client_token) <= 64, 'Client session token should be 4 to 64 characters long.'
so changing it to a token of at least 4 characters would let first-time users run the sample without errors.
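To illustrate both assertions reported above, here is a toy reproduction of the session-bound wrapper pattern. This is NOT the real library implementation; the classes and messages only mimic its observable behavior, assuming a Session object binds wrapper classes to itself.

```python
# Toy reproduction of the "API wrapper via a Session object" pattern.
# Illustrative only; not the actual ai.backend.client code.

class Kernel:
    session = None  # unbound by default

    @classmethod
    def get_or_create(cls, lang, client_token=None):
        assert cls.session is not None, \
            'You must use API wrapper functions via a Session object.'
        assert 4 <= len(client_token) <= 64, \
            'Client session token should be 4 to 64 characters long.'
        return f'kernel:{lang}:{client_token}'

class Session:
    """Holds API configuration and exposes session-scoped wrapper classes."""
    def __init__(self):
        # Bind a session-scoped subclass so classmethods can see the session.
        self.Kernel = type('Kernel', (Kernel,), {'session': self})

# Calling the unbound class reproduces the reported AssertionError:
try:
    Kernel.get_or_create('python', client_token='abcdef')
except AssertionError as e:
    print(e)  # You must use API wrapper functions via a Session object.

# Going through a Session object works (with a token of 4+ characters):
sess = Session()
print(sess.Kernel.get_or_create('python', client_token='abcdef'))
```

This also shows why a 3-character token like 'abc' would trip the length assertion even when the session is supplied correctly.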
It would be better to have a way to check the username of the logged-in user:
write a config.json file in the appdir on login with the username information, and remove config.json on logout.
Support APIv2 and refactor internal implementations.
Automatically use the --legacy option when the server API is v3.
This is for forward compatibility.
Currently the requests module automatically handles redirects, but the authentication header is not updated on the redirected request, which causes authorization failures.
New Sorna gateway will support custom HTTP methods using X-Method-Override
header.
Let's add a configuration option that enables using this header instead of directly sending custom HTTP methods.
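A minimal sketch of what such a translation could look like. The helper name and the set of "standard" verbs are assumptions for illustration, not the client's actual implementation:

```python
# Sketch: translating custom HTTP methods into POST + X-Method-Override,
# for gateways/proxies that reject non-standard verbs.
# prepare_method is a hypothetical helper, not the real client API.

STANDARD_METHODS = {'GET', 'HEAD', 'POST', 'PUT', 'DELETE', 'PATCH', 'OPTIONS'}

def prepare_method(method: str, headers: dict, use_override: bool) -> tuple:
    """Return (wire_method, headers), applying X-Method-Override if enabled."""
    headers = dict(headers)  # do not mutate the caller's dict
    if use_override and method.upper() not in STANDARD_METHODS:
        headers['X-Method-Override'] = method.upper()
        return 'POST', headers
    return method.upper(), headers

# A custom method such as REPORT goes out as POST with the override header:
print(prepare_method('REPORT', {}, use_override=True))
# ('POST', {'X-Method-Override': 'REPORT'})
```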
A new optional argument: error_if_not_exists (default: False for backward compatibility). If set to True, raise an error when the kernel for the specified client-side session token does not exist.
Thanks to @gofeel for the clarification.
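The proposed semantics can be sketched as follows. The registry dict and the KernelNotFound exception are illustrative stand-ins, not the library's actual types:

```python
# Sketch of the proposed error_if_not_exists behavior (default False keeps
# the old "silently create" behavior). Toy code, not the real library.

class KernelNotFound(Exception):
    pass

_kernels = {}  # client-side session token -> kernel (toy registry)

def get_or_create(token: str, *, error_if_not_exists: bool = False):
    if token not in _kernels:
        if error_if_not_exists:
            raise KernelNotFound(f'No kernel for session token {token!r}')
        _kernels[token] = f'kernel-for-{token}'  # default: create it
    return _kernels[token]

print(get_or_create('abcdef'))  # creates and returns the kernel
print(get_or_create('abcdef'))  # returns the existing one
# get_or_create('missing', error_if_not_exists=True)  # would raise KernelNotFound
```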
Let's write a basic Python client library that conforms with the API specification.
For integration plugins (e.g., Jupyter), it is often impossible or difficult to provide an interface to specify detailed kernel creation options. We could allow users to specify those via environment variables.
Let's add an --args argument so that all later arguments are passed as if they were the arguments of the --exec option.
Related to lablup/backend.ai-manager#26.
For example, each kernel object should be able to use a different API configuration, like the low-level request objects.
It's September 2019.
Let's add a command python -m ai.backend.client.cli proxy
which opens an insecure HTTP server where all requests are proxied to the original API server with configured authorization headers.
This will allow use of many convenient API development tools such as GraphiQL.
pip install backend.ai-client
The backend.ai download <sess-id> <filepath> command is broken with the following traceback:
Traceback (most recent call last):
File "/Users/adrysn/.pyenv/versions/playground/bin/backend.ai", line 11, in <module>
sys.exit(main())
File "/Users/adrysn/.pyenv/versions/3.7.0/envs/playground/lib/python3.7/site-packages/ai/backend/client/cli/__init__.py", line 104, in main
args.function(args)
File "/Users/adrysn/.pyenv/versions/3.7.0/envs/playground/lib/python3.7/site-packages/ai/backend/client/cli/__init__.py", line 40, in wrapped
handler(args)
File "/Users/adrysn/.pyenv/versions/3.7.0/envs/playground/lib/python3.7/site-packages/ai/backend/client/cli/files.py", line 44, in download
kernel.download(args.files, show_progress=True)
File "/Users/adrysn/.pyenv/versions/3.7.0/envs/playground/lib/python3.7/site-packages/ai/backend/client/base.py", line 79, in _caller
return self._handle_response(resp, gen)
File "/Users/adrysn/.pyenv/versions/3.7.0/envs/playground/lib/python3.7/site-packages/ai/backend/client/base.py", line 37, in _handle_response
meth_gen.send(resp)
File "/Users/adrysn/.pyenv/versions/3.7.0/envs/playground/lib/python3.7/site-packages/ai/backend/client/kernel.py", line 191, in _download
total=resp.stream_reader.total_bytes,
AttributeError: 'Response' object has no attribute 'stream_reader'
Meanwhile, downloading from vfolders is working.
The current Python API has many optional parameters (with default values) but they can be passed as either positional or keyword arguments.
Let's restrict them to keyword-only arguments, so that adding new arguments does not break existing code built on the Python API.
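In Python this is done with a bare '*' in the signature. A minimal sketch (the execute signature below is illustrative, not the library's exact one):

```python
# Sketch: parameters after a bare '*' are keyword-only, so inserting or
# reordering parameters later cannot silently re-bind positional callers.

def execute(code, *, mode='query', run_id=None, opts=None):
    """'mode', 'run_id', and 'opts' must be passed by keyword."""
    return {'code': code, 'mode': mode, 'run_id': run_id, 'opts': opts}

print(execute('print(1)', mode='batch')['mode'])  # batch

# Passing them positionally now fails fast instead of mis-binding:
try:
    execute('print(1)', 'batch')
except TypeError as e:
    print(type(e).__name__)  # TypeError
```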
In http://docs.backend.ai/en/latest/gsg/clientlib.html, the sorna-client package on pip is still at version 0.9.3, so the install command needs to be changed to
pip install backend.ai-client
and the sample code in that document needs to be updated as well.
I tried the integration with the sample code changed as below:
from ai.backend.client.request import Request
req = Request('GET', '/authorize', {'echo': 'test'})
rep = req.send()
print(rep.status)
print(rep.json())
However, the newly changed client requires a session instead of a method argument, and since there is no guideline document about session creation, the integration is difficult.
Sometimes, there is a need to check specific container logs.
Currently there is no way to download generated output files from vfolders/kernels, except those placed in the .output directory, which are uploaded to S3 automatically by the agent.
Refs:
Configuring a network is often painful for end-users.
Let's provide a better way in the CLI of sorna client, and let the users just open "localhost:8081" to magically connect to a web page served by the kernel session.
We could use the SSH socks proxy or just some custom-built ones via WebSockets.
git/shell kernels and some REPL languages can be launched as terminal apps with a pseudo-terminal.
We could directly connect them to the user's local terminal, just like using a cloud desktop.
Go to https://github.com/lablup/backend.ai/issues; this issue tracker is preserved for archival purposes only.
lcc: C/C++ on lablup.ai cloud.
lpython: Python on lablup.ai cloud.
Let's add commands for managing the agent through the agent watcher.
Currently, the memory unit is GB: 0.5 == 512 MiB, 1 == 1024 MiB. Let's allow users to specify humanized units, for example -r ram=512mb or -r ram=2.5GB.
I think stripping the trailing "b" would be better, though we could optionally support it (e.g., -r ram=512m).
Also, how about using -r mem=... instead of -r ram=...? (Or maybe we could treat ram, mem, and memory as the same key.)
Let's make apt-get install backend.ai-client possible.
Top priority: 18.04
Second priorities: 18.10, 19.04
Dropped: 16.04 due to Python 3.5.1
After removing a no-longer-needed argument when using client.request.Request instances, AttributeError: 'ComputeSession' object has no attribute 'id' occurred in src/ai/backend/client/versioning.py:33 when running the command backend.ai app ID jupyterlab.
def get_id_or_name(api_version: Tuple[int, str], obj: ComputeSession) -> str:
    if api_version[0] <= 4:
        return obj.name
    return obj.id
As shown in the picture below, api_version[0] was printed as 6, but the ComputeSession object had no attribute 'id'. When the version check was removed, it worked normally again. Should the version check be removed from the source code?
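Rather than deleting the version check outright, one option is a defensive fallback: prefer the ID under API v5+ but tolerate objects that carry only a name. This is a sketch, not a proposed patch; DummySession is an illustrative stand-in for ComputeSession:

```python
from typing import Tuple

# Sketch: fall back to the session name when the 'id' attribute is missing
# (e.g., when the object was constructed from a name only).

def get_id_or_name(api_version: Tuple[int, str], obj) -> str:
    if api_version[0] <= 4:
        return obj.name
    # API v5+ prefers the ID, but tolerate objects that carry only a name.
    return getattr(obj, 'id', None) or obj.name

class DummySession:
    """Illustrative stand-in for ComputeSession."""
    def __init__(self, name, id=None):
        self.name = name
        if id is not None:
            self.id = id

print(get_id_or_name((6, '20200815'), DummySession('jupyterlab')))            # jupyterlab
print(get_id_or_name((6, '20200815'), DummySession('jupyterlab', 'abc123')))  # abc123
```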
The backend.ai config command displays the current settings, including the API endpoint and API version. However, the displayed API version is always the latest one (v5.20191215 for the current master). For example, the https://api.backend.ai endpoint currently serves v4.20181215, but when running backend.ai config you will see an API version of v5.20191215.
I think we need to display the correct API version of the endpoint by fetching the information from it. Or, at the least, we should let users set the API version via an environment variable such as BACKEND_API_VERSION.
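The environment-variable fallback could be as simple as the sketch below. The variable name comes from the suggestion above; the default version string and helper name are illustrative:

```python
import os

# Sketch: let BACKEND_API_VERSION override the client's built-in default
# API version string. Hypothetical helper, not the client's actual code.

DEFAULT_API_VERSION = 'v5.20191215'  # illustrative default

def get_api_version(environ=os.environ) -> str:
    return environ.get('BACKEND_API_VERSION', DEFAULT_API_VERSION)

print(get_api_version({}))                                      # v5.20191215
print(get_api_version({'BACKEND_API_VERSION': 'v4.20181215'}))  # v4.20181215
```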
session = Session()
vf = session.VFolder
name = "first_vf"
vf.create(name)
vf.get(name)
Running the above raises the following error:
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
<ipython-input-24-586f79b79e3d> in <module>
----> 1 vf.get('first_vf')
~/code/brgg/CodeLion/.venv/lib/python3.6/site-packages/ai/backend/client/base.py in _caller(cls, *args, **kwargs)
63 'You must use API wrapper functions via a Session object.'
64 gen = meth(*args, **kwargs)
---> 65 resp = cls._make_request(gen)
66 return cls._handle_response(resp, gen)
67
~/code/brgg/CodeLion/.venv/lib/python3.6/site-packages/ai/backend/client/base.py in _make_request(gen)
49 @staticmethod
50 def _make_request(gen):
---> 51 rqst = next(gen)
52 resp = rqst.fetch()
53 return resp
TypeError: 'VFolder' object is not an iterator
It seems this happens because the get method, unlike the other methods, is not handled specially; please take a look.
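The traceback's TypeError can be reproduced with a toy version of the wrapper machinery: _make_request calls next() on whatever the API method returns, expecting a generator that yields a request. A method that returns a plain value instead of yielding triggers exactly this error. Illustrative code only, not the library's implementation:

```python
# Toy reproduction of "TypeError: 'VFolder' object is not an iterator".

def call_api(meth, *args):
    gen = meth(*args)
    rqst = next(gen)  # fails unless meth is a generator function
    return rqst

class VFolder:
    def create(self, name):
        yield ('POST', '/folders', name)  # generator: works with next()

    def get(self, name):
        return self                       # plain return: breaks next()

vf = VFolder()
print(call_api(vf.create, 'first_vf'))  # ('POST', '/folders', 'first_vf')
try:
    call_api(vf.get, 'first_vf')
except TypeError as e:
    print(e)  # 'VFolder' object is not an iterator
```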
Let's use appdirs for:
cookie.dat
for session-based auth mode.
The argparse module is not working as expected.
$ backend.ai run python --exec "python hello.py" hello.py
usage: backend.ai [-h] {help,config,run,proxy,admin,ps} ...
backend.ai: error: unrecognized arguments: hello.py
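One likely cause is that argparse rejects positional arguments that follow an optional like --exec. Since Python 3.7, parse_intermixed_args() handles exactly this layout. The sketch below is a standalone illustration of the 'run' command's parser, not the client's actual CLI code:

```python
import argparse

# Sketch: parse_intermixed_args() lets positional file arguments follow
# optionals such as --exec, fixing the "unrecognized arguments" failure.

run = argparse.ArgumentParser(prog='backend.ai run')
run.add_argument('lang', help='kernel language, e.g. python')
run.add_argument('--exec', dest='exec_cmd', help='command to run inside the kernel')
run.add_argument('files', nargs='*', help='files to upload')

args = run.parse_intermixed_args(
    ['python', '--exec', 'python hello.py', 'hello.py'])
print(args.lang, args.exec_cmd, args.files)
```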
python -m sorna or python -m sorna.client
Support --build, --exec, --lang, --mount, --limit, and --gpu params. If some params are omitted, infer them with some heuristics: e.g., choose the only script file in the arguments as the --exec target; if the arguments contain a Makefile, then use it as the --build target.
--sess-id to manually specify the client-side session ID (i.e., a unique identifier of the session). If omitted, auto-create a new session and show an arbitrary ID so that the user can refer to it in subsequent executions.
-d / --detach option
--rm option (like Docker).
https://github.com/pallets/click provides a convenient interface for composing shell commands. Let's adopt it.