cloud-cv / evalai-cli
:cloud: :rocket: Official EvalAI Command Line Tool
Home Page: https://cli.eval.ai
License: BSD 3-Clause "New" or "Revised" License
This line will give a KeyError upon running the command:
evalai challenges 340 phases
where 340 is the pk of a challenge that doesn't exist.
@isht3 Can you please verify this?
Currently, most of the request functions use the same HTTP request pattern, with a few exceptions:

```python
headers = get_request_header()
try:
    response = requests.get(url, headers=headers)
    response.raise_for_status()
except requests.exceptions.HTTPError as err:
    if response.status_code in EVALAI_ERROR_CODES:
        validate_token(response.json())
        echo(style("Error: {}".format(response.json()["error"]), fg="red", bold=True))
    else:
        echo(err)
    sys.exit(1)
except requests.exceptions.RequestException as err:
    echo(err)
    sys.exit(1)
```

This could easily be simplified by abstracting the pattern into a single helper function.
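One possible shape for that helper, sketched framework-free (the helper name and the error-code set are assumptions; `response` can be any object with a requests.Response-like interface):

```python
import sys

def handle_response(response, echo=print, error_codes=(400, 401, 403, 406)):
    """Apply the shared error handling to a completed request.

    `response` is any object exposing `status_code`, `raise_for_status()`,
    and `json()`, like requests.Response. Returns the parsed JSON on
    success; echoes an error and exits on failure.
    """
    try:
        response.raise_for_status()
        return response.json()
    except Exception as err:
        if response.status_code in error_codes:
            echo("Error: {}".format(response.json().get("error", err)))
        else:
            echo(err)
        sys.exit(1)
```

Each call site then shrinks to something like `handle_response(requests.get(url, headers=get_request_header()))`.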
Currently, click doesn't natively support a date parameter type. We want to add this feature to the CLI so it can accept dates from the user.
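A minimal sketch of the parsing half, assuming the mm/dd/yyyy format used by the submissions commands in these issues; a custom `click.ParamType` subclass could call a function like this from its `convert()` method:

```python
from datetime import datetime

def parse_date(value, fmt="%m/%d/%Y"):
    """Parse an mm/dd/yyyy string into a datetime.date, with a clear error."""
    try:
        return datetime.strptime(value, fmt).date()
    except ValueError:
        raise ValueError("{!r} is not a valid mm/dd/yyyy date".format(value))
```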
Current Scenario: The docstring describes the functioning of the function in a single line.
Deliverables:
We should add docstrings answering the following points:
On running the command evalai challenges --participant, all the challenges in which the participant has participated are displayed, i.e. present as well as past ones.
The command evalai challenges --participant should only display the ongoing challenges in which the participant has participated.
We should now focus on covering all the minor test cases in order to increase the coverage to 95%.
Add badges to the README.
The test cases here are to be checked & rectified.
Right now, we can submit a file, but the user should also have the option to fill in submission metadata, like the method name, method description, project URL, and publication URL, at the time of submission.
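A sketch of how those optional fields could be collected interactively (the field names come from the issue; the helper itself is hypothetical):

```python
def collect_submission_metadata(prompt=input):
    """Ask for the optional metadata fields; empty answers become None."""
    fields = ["method_name", "method_description", "project_url", "publication_url"]
    return {f: prompt("Enter {} (optional): ".format(f.replace("_", " "))) or None
            for f in fields}
```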
Once we have the project in a semi-working condition, it would be nice to have automatic testing and deployment to pypi.org via Travis or a similar CI/CD tool.
The copyright year in the footer of the docs landing page is outdated (2018).
Make the copyright year dynamic.
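Assuming the docs are built with Sphinx, the year can be computed at build time in `conf.py` (the project name below is a placeholder):

```python
import datetime

# Recomputed on every docs build, so the footer never goes stale.
copyright = "{}, CloudCV".format(datetime.datetime.now().year)
```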
Make docstrings more consistent by defining them with the following terms:
ARGS
PARSES
RAISES
Deliverables:
By their very nature, CLI tools are harder to figure out than GUI tools. We will need really good docs to ensure the evalai-cli tool gains traction among users.
It would be great to have automatic documentation generation from docstrings as a starting point, so we can add more detailed documentation later.
We should display a server error if the API isn’t working or is down.
$ evalai challenges upcoming
doesn't work. Instead, the function that should be invoked is named future. REF.
$ evalai challenges future
@RishabhJain2018 @isht3 This is just a typo kind of issue. Let me know if I can propose a quick fix. 😄
Right now, line breaks are a continuous stream of hyphens; we should change them to something more aesthetically pleasing, like '-'*N.
For more details, please refer to #29.
Right now, there are some user-specific settings that need to be configured to use the CLI with different instances of EvalAI. Some of them are:
The default Host URL would be localhost:8000, and the user could change it to their own host to use different instances of EvalAI.
While displaying the future challenges, we should display the start date instead of the end date.
We should display an error that the token has expired and ask the user to generate it again.
Currently, when the command evalai set_token <auth_token> is copied to the clipboard, the content which is copied is evalai set_token <auth_token>.
The copied content should be evalai set_token <auth_token>
We should have templates that describe to new contributors what the expectations of the project are. This requires putting the files in a .github folder so that GitHub renders them appropriately. Below is a list of files that will be needed to close this issue:
CONTRIBUTING.md
ISSUE_TEMPLATE.md
PULL_REQUEST_TEMPLATE.md
Users should be able to create a challenge configuration zip using the CLI. The idea is to have a setup similar to what Sphinx provides when you generate a documentation configuration. It should work something like this:
Command: evalai create_challenge

```
$ evalai create_challenge
Enter the challenge name:
Test Challenge
Enter the Number of Phases:
2
Enter the Phase 1 name:
Test Phase
Enter the Phase 1 codename:
test-phase
...
```
CC: @isht3 @varunagrawal @RishabhJain2018
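The interactive flow above could be prototyped with plain prompts (the function name and the config keys below are hypothetical; a later step would package the result into the configuration zip):

```python
def challenge_wizard(prompt=input):
    """Collect the answers for a challenge configuration, one prompt at a time."""
    config = {"title": prompt("Enter the challenge name: "), "phases": []}
    for i in range(1, int(prompt("Enter the Number of Phases: ")) + 1):
        config["phases"].append({
            "name": prompt("Enter the Phase {} name: ".format(i)),
            "codename": prompt("Enter the Phase {} codename: ".format(i)),
        })
    return config
```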
Get all challenges
evalai challenges list
Show the number of challenges in which the participant has participated
evalai challenges list -participate true
Show the number of challenges which the participant has hosted
evalai challenges list -host true
Get current challenges
evalai challenges list ongoing
Get past challenges
evalai challenges list past
Get future challenges
evalai challenges list future
Get details about the phases of a challenge
evalai challenges phases list -c 1
Get details about a particular phase of a challenge
evalai challenges phases -c 1 -p 2
Get all participant teams of the user
evalai teams list
Create a participant team
evalai teams create
Participate in a challenge
evalai challenges participate -c 1 -pt 123
Make submissions to the challenge phase
evalai challenges submit -c 1 -p 2 -file submission.json
Get status of a particular submission
evalai submissions -s 1234
Get all the challenge phase splits of a phase in a challenge
evalai challenges phases splits -c 1 -p 2
Pretty print the leaderboard of a challenge
evalai challenges leaderboard -c 1 -cps 3
Show submission stats for a challenge (both for participants and hosts)
evalai challenges submissions stats -c 1
Show submission stats for a challenge phase (both for participants and hosts)
evalai challenges submissions stats -c 1 -p 2
Display the number of submissions made between two dates for a challenge phase
evalai challenges submissions -d1 mm/dd/yyyy -d2 mm/dd/yyyy
Display the number of submissions made between two dates for a challenge phase, filtered by participant team
evalai challenges submissions -d1 mm/dd/yyyy -d2 mm/dd/yyyy -pt 12
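The -d1/-d2 date filter above could be sketched like this (the `submitted_at` field name is an assumption):

```python
from datetime import datetime

def submissions_between(submissions, d1, d2, fmt="%m/%d/%Y"):
    """Count submissions whose `submitted_at` falls between the two
    mm/dd/yyyy bounds, inclusive."""
    start = datetime.strptime(d1, fmt)
    end = datetime.strptime(d2, fmt)
    return sum(1 for s in submissions if start <= s["submitted_at"] <= end)
```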
If a user makes a submission using the evalai push command while Docker isn't running on their machine, the error thrown is shown below, and it will confuse the user.
```
Traceback (most recent call last):
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/urllib3/connectionpool.py", line 600, in urlopen
    chunked=chunked)
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/urllib3/connectionpool.py", line 354, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1239, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1285, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1234, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1026, in _send_output
    self.send(msg)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 964, in send
    self.connect()
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/docker/transport/unixconn.py", line 42, in connect
    sock.connect(self.unix_socket)
ConnectionRefusedError: [Errno 61] Connection refused

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/requests/adapters.py", line 449, in send
    timeout=timeout
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/urllib3/connectionpool.py", line 638, in urlopen
    _stacktrace=sys.exc_info()[2])
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/urllib3/util/retry.py", line 367, in increment
    raise six.reraise(type(error), error, _stacktrace)
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/urllib3/packages/six.py", line 685, in reraise
    raise value.with_traceback(tb)
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/urllib3/connectionpool.py", line 600, in urlopen
    chunked=chunked)
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/urllib3/connectionpool.py", line 354, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1239, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1285, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1234, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 1026, in _send_output
    self.send(msg)
  File "/Library/Frameworks/Python.framework/Versions/3.6/lib/python3.6/http/client.py", line 964, in send
    self.connect()
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/docker/transport/unixconn.py", line 42, in connect
    sock.connect(self.unix_socket)
urllib3.exceptions.ProtocolError: ('Connection aborted.', ConnectionRefusedError(61, 'Connection refused'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/bin/evalai", line 10, in <module>
    sys.exit(main())
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/click/core.py", line 722, in __call__
    return self.main(*args, **kwargs)
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/click/core.py", line 697, in main
    rv = self.invoke(ctx)
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/click/core.py", line 895, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/click/core.py", line 535, in invoke
    return callback(*args, **kwargs)
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/evalai/submissions.py", line 53, in push
    docker_image = docker_client.images.get(image)
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/docker/models/images.py", line 312, in get
    return self.prepare_model(self.client.api.inspect_image(name))
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/docker/utils/decorators.py", line 19, in wrapped
    return f(self, resource_id, *args, **kwargs)
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/docker/api/image.py", line 245, in inspect_image
    self._get(self._url("/images/{0}/json", image)), True
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/docker/utils/decorators.py", line 46, in inner
    return f(self, *args, **kwargs)
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/docker/api/client.py", line 215, in _get
    return self.get(url, **self._set_request_timeout(kwargs))
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/requests/sessions.py", line 537, in get
    return self.request('GET', url, **kwargs)
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/requests/sessions.py", line 524, in request
    resp = self.send(prep, **send_kwargs)
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/requests/sessions.py", line 637, in send
    r = adapter.send(request, **kwargs)
  File "/Users/rishabhjain/Documents/Projects/evalai-cli/env/lib/python3.6/site-packages/requests/adapters.py", line 498, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', ConnectionRefusedError(61, 'Connection refused'))
```
The error thrown should be: "Docker isn't running on your system. Please start Docker before making submissions."
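A generic guard in that spirit (the helper is hypothetical; in `evalai push` the `ping` argument would be something like `docker.from_env().ping`, which raises when the daemon is unreachable):

```python
import sys

def ensure_daemon(ping, echo=print):
    """Call ping(); if it raises, print a friendly message instead of a
    raw traceback and exit."""
    try:
        ping()
    except Exception:
        echo("Docker isn't running on your system. "
             "Please start Docker before making submissions.")
        sys.exit(1)
```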
evalai login <username>, which will prompt for the password and store the token.json file at ~/.evalai/token.json on the user's system.
An error message like "you have not participated in the challenge" is not shown when a user tries to submit to a challenge without participating in it.
The command should be something like evalai challenge create --file=<path_to_zip_file>.
Please feel free to ask if you have doubts.
We need to figure out a better way to save the token file name.
One way could be to have a config.json that saves these values and is loaded at startup. This would allow us to save not only custom config file names but also custom locations.
We can prompt the user at startup to enter these values if they are missing, providing the current values as defaults (similar to how SSH keys are generated).
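A sketch of that config.json idea, with defaults for any missing key (the file location, key names, and defaults below are all assumptions):

```python
import json
import os

CONFIG_PATH = os.path.expanduser("~/.evalai/config.json")  # hypothetical location
DEFAULTS = {
    "host_url": "http://localhost:8000",
    "token_file": os.path.expanduser("~/.evalai/token.json"),
}

def load_config(path=CONFIG_PATH):
    """Load the user's settings, falling back to defaults for missing keys."""
    config = dict(DEFAULTS)
    if os.path.exists(path):
        with open(path) as f:
            config.update(json.load(f))
    return config
```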
Command copied to clipboard.
Should pip install evalai automatically create ~/.evalai?
Current Scenario:
Challenge statistics are visible to both challenge hosts and challenge participants.
Deliverables:
The challenge statistics should only be visible to the challenge host.
The command should be evalai gettoken.
The output should be the token stored in the ~/.evalai/token.json file.
I am not sure if this is necessary, but when I tried running a few commands, I saw a warning to add token.json. That's why I created this issue.
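The proposed gettoken command could reduce to a helper like this (the function name and the `"token"` key are assumptions):

```python
import json
import os

def get_token(path="~/.evalai/token.json"):
    """Return the token stored in the token.json file, or raise a
    friendly error if the file is missing."""
    full = os.path.expanduser(path)
    if not os.path.exists(full):
        raise FileNotFoundError(
            "The authentication token json file doesn't exist at {}".format(full))
    with open(full) as f:
        return json.load(f)["token"]
```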
The token which is placed at ~/.evalai/token.json gets reset whenever a user runs the test cases on their local machine.
Steps to reproduce:
1. Note the token value in ~/.evalai/token.json.
2. Run py.test.
3. Check ~/.evalai/token.json again.
The token value at step 1 != the value at step 3.
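One way to protect the real token during a test run is to back the file up and restore it afterwards; a hypothetical context manager (it could also be wired up as a pytest fixture):

```python
import os
import shutil
from contextlib import contextmanager

@contextmanager
def preserve_token(path="~/.evalai/token.json"):
    """Back up the token file before the tests and restore it afterwards,
    so py.test can't clobber the user's real token."""
    full = os.path.expanduser(path)
    backup = full + ".bak"
    existed = os.path.exists(full)
    if existed:
        shutil.copy2(full, backup)
    try:
        yield
    finally:
        if existed:
            shutil.move(backup, full)
```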
Description: Revert pull request #2 and merge it again by squashing the changes into one merge commit with a proper commit message.
Add support in ongoing for filtering out past challenges.
Since the coverage is 72%, which is quite low, we should add test cases to cover all the code before merging new PRs.
Deliverables: Increase the coverage to 90%.
Change the error message shown when the token file isn't present in the specified directory.
The message should be:
The authentication token json file doesn't exist at the required path.
Please download the file from the Profile section of the EvalAI webapp and place it at ~/.evalai/token.json
The function get_challenge_count, called when either participant or host is True, first checks for host and, only if host is False, checks for participant.
So when both flags are active, only the host challenges are printed. We have to redesign the get_challenge_count function to account for this (and possibly rename it too).
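One possible redesign, counting each flag independently instead of short-circuiting on host (the signature and the `is_host`/`is_participant` fields are assumptions about the challenge data):

```python
def get_challenge_count(challenges, host=False, participant=False):
    """Count host and participant challenges independently, so activating
    both flags reports both numbers."""
    counts = {}
    if host:
        counts["host"] = sum(1 for c in challenges if c.get("is_host"))
    if participant:
        counts["participant"] = sum(1 for c in challenges if c.get("is_participant"))
    return counts
```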