docker-science / cookiecutter-docker-science
Cookiecutter template for data scientists working with Docker containers
Home Page: https://docker-science.github.io/
License: Apache License 2.0
Add Requirements section and Docker.
We added additional Docker files such as Dockerfile.dev and Dockerfile.release. I would like to add a description of their usage in the development cycle.
Currently some targets work only in Docker containers. I would like to make such targets executable only in Docker containers.
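One way to enforce this could be a guard that checks for Docker's container marker file before a target does any work. A minimal sketch, assuming the `/.dockerenv` marker that Docker creates in every container (the function name and message are mine, not the template's actual code):

```shell
#!/bin/sh
# Sketch: refuse to run unless inside a container. Docker creates
# /.dockerenv in every container; the path is overridable for testing.
container_guard() {
    marker="${1:-/.dockerenv}"
    if [ ! -f "$marker" ]; then
        echo "error: this target must be run inside the Docker container" >&2
        return 1
    fi
}
```

A Makefile target could call `container_guard` (or an equivalent inline test) as its first recipe line, so container-only targets fail fast on the host.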
Support documentation with Sphinx.
Hi, thanks for the wonderful tool.
I always use cookiecutter-docker-science on Ubuntu 16.04, but make init-docker did not work when I tried to use it on macOS.
First, create a repository with the default settings.
$ cookiecutter git@github.com:docker-science/cookiecutter-docker-science.git
project_name [project_name]: test-coociecutter-docker-science
project_slug [test_coociecutter_docker_science]:
jupyter_host_port [8888]:
description [Please Input a short description]:
Select data_source_type:
1 - s3
2 - nfs
3 - url
Choose from 1, 2, 3 [1]:
data_source [Please Input data source]:
Select use_nvidia_docker:
1 - no
2 - yes
Choose from 1, 2 [1]:
Second, execute make init-docker:
$ cd test_coociecutter_docker_science
$ make init-docker
docker build -t test_coociecutter_docker_science-image -f docker/Dockerfile --build-arg UID=1663316204 .
Sending build context to Docker daemon 19.97kB
Step 1/10 : FROM ubuntu:16.04
---> 20c44cd7596f
Step 2/10 : RUN apt-get update && apt-get install -y git python3.5 python3-pip python3-dev
---> Using cache
---> 8bb9d3f9bec9
Step 3/10 : RUN pip3 install --upgrade pip
---> Using cache
---> 5e906e1cbec5
Step 4/10 : COPY ./requirements.txt /requirements.txt
---> Using cache
---> 91d1c70b7126
Step 5/10 : RUN pip install -r /requirements.txt
---> Using cache
---> aac86f8ffef4
Step 6/10 : ARG UID
---> Using cache
---> 1b13e2d797c6
Step 7/10 : RUN useradd docker -u $UID -s /bin/bash -m
---> Running in 416e1cfe122a
Error processing tar file(exit status 1): write /var/log/faillog: no space left on device
make: *** [init-docker] Error 1
When building the Docker image, can I not set macOS's UID (1663316204) on Ubuntu?
I deleted the useradd line in the Dockerfile and could set the UID manually after creating the container:
root@08bfc3edbca1:/work# useradd docker -u 1663316204 -s /bin/bash -m
root@08bfc3edbca1:/work# su docker
docker@08bfc3edbca1:/work$ id -u
1663316204
So I added an entrypoint.sh and fixed it like this.
I hope this is useful for you.
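For reference, a sketch of such an entrypoint. All names here are assumptions rather than the template's actual file; the key detail is `useradd -l` (`--no-log-init`), which skips the lastlog/faillog entries whose sparse-file writes fail with "no space left on device" for very large UIDs:

```shell
#!/bin/sh
# entrypoint.sh (sketch): create the user at container start time with the
# host UID passed in via the environment, e.g.
#   docker run -e HOST_UID=$(id -u) ...
set -e
HOST_UID="${HOST_UID:-1000}"
# -l skips the lastlog/faillog entries; writing them as sparse files
# fails for huge macOS UIDs like 1663316204.
useradd -l -u "$HOST_UID" -s /bin/bash -m docker
exec su - docker
```

Alternatively, simply adding `-l` to the existing `useradd` line in the Dockerfile may be enough to fix the build-time failure without moving user creation to an entrypoint.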
I use Docker for Mac.
$ docker version
Client:
Version: 18.06.1-ce
API version: 1.38
Go version: go1.10.3
Git commit: e68fc7a
Built: Tue Aug 21 17:21:31 2018
OS/Arch: darwin/amd64
Experimental: false
Server:
Engine:
Version: 18.06.1-ce
API version: 1.38 (minimum version 1.12)
Go version: go1.10.3
Git commit: e68fc7a
Built: Tue Aug 21 17:29:02 2018
OS/Arch: linux/amd64
Experimental: true
When starting make init in a Python environment without aws-cli, I got the following error:
pyenv: aws: command not found
Should awscli be in requirements.txt when using S3 data sources?
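Besides adding awscli to requirements.txt, a cheap preflight check could make the failure clearer than the opaque "command not found". A sketch (the function name is mine, not the template's):

```shell
#!/bin/sh
# Sketch: verify a required CLI exists before using it, and print an
# install hint when it is missing.
require_tool() {
    tool="$1"; package="$2"
    if ! command -v "$tool" >/dev/null 2>&1; then
        echo "error: $tool is required (try: pip install $package)" >&2
        return 1
    fi
}

# Example: a target using the S3 data source would run
#   require_tool aws awscli
# Checking for sh here only demonstrates the success path.
require_tool sh sh && echo "preflight ok"
```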
Add test environments for Python 2.6 and 2.7.
requirements.txt contains libraries for development such as Sphinx and flake8. These libraries should be moved to requirements_dev.txt.
docker@8c350098e734:/work$ make
/bin/sh: 1: python: not found
Makefile:44: recipe for target 'help' failed
make: *** [help] Error 127
Currently make clean-docker removes the image and the containers. I would like a command that removes only the containers.
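A sketch of such a target, assuming a `CONTAINER_NAME` variable in the style of the template's Makefile (the target name and variable are mine):

```make
clean-container: ## Remove only the container; keep the image
	docker rm -f $(CONTAINER_NAME)
```

`docker rm -f` removes the container even if it is still running, while the image built by init-docker stays available for the next create-container.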
docker-build:
...
docker-build-without-cache:
...
Create PR cookiecutter/cookiecutter#1040
pipenv is the officially recommended Python packaging tool from Python.org.
I think it would be good to support pipenv in cookiecutter-docker-science.
Currently, input data needs to be stored in S3. I would like to support shared directories or other URLs.
Sometimes I fail to create the Docker container with make create-container because the port is already occupied. I would like to pick a random port between 5000 and 9999.
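A portable way to sketch this (POSIX sh has no $RANDOM, so awk's rand() is used; the range matches the proposal):

```shell
#!/bin/sh
# Sketch: pick a pseudo-random host port in [5000, 9999] for `docker run -p`.
random_port() {
    awk 'BEGIN { srand(); print 5000 + int(rand() * 5000) }'
}
random_port
```

Note that srand() seeds from the clock, so two calls in the same second return the same port; to fully solve the issue, the chosen port would still need to be checked for availability before `docker run`.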
When we use multiple Dockerfile or requirements.txt files for separate purposes, we need to specify the setting through environment variables as described in #53.
But setting such variables through command-line parameters is tedious, so I would like to add a basic settings file for the environment configuration. The following is a sample of the settings file (.env):
DOCKERFILE=docker/Dockerfile.test
REQUIREMENTS=test_requirement.txt
The template should generate not a .env file but a .env_template file, so that no settings are loaded until users rename .env_template to .env and add their settings to the file.
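In the Makefile this could look like the following sketch (the variable names come from the sample above; the defaults are assumptions):

```make
# Load user overrides if .env exists; stay silent if it does not.
-include .env
DOCKERFILE ?= docker/Dockerfile
REQUIREMENTS ?= requirements.txt
```

GNU Make's `-include` ignores a missing file, and `?=` only assigns when the variable was not already set, so values from .env (or the command line) take precedence over the defaults.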
I would like to embed the overview slides of this project into the README.
I got the following warnings when building a Docker image with make init-docker:
WARNING: The scripts f2py, f2py3 and f2py3.6 are installed in '/home/docker/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The scripts jupyter, jupyter-migrate and jupyter-troubleshoot are installed in '/home/docker/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script jsonschema is installed in '/home/docker/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script jupyter-trust is installed in '/home/docker/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script pygmentize is installed in '/home/docker/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The scripts iptest, iptest3, ipython and ipython3 are installed in '/home/docker/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The scripts jupyter-kernel, jupyter-kernelspec and jupyter-run are installed in '/home/docker/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script jupyter-nbconvert is installed in '/home/docker/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The scripts jupyter-bundlerextension, jupyter-nbextension, jupyter-notebook and jupyter-serverextension are installed in '/home/docker/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script jupyter-console is installed in '/home/docker/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
WARNING: The script chardetect is installed in '/home/docker/.local/bin' which is not on PATH.
Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
I failed to run make jupyter in the Docker container. The problem is fixed by adding the directory to the $PATH environment variable, as the warnings describe:
export PATH=$PATH:/home/docker/.local/bin
I would like to run targets without this workaround.
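Baking the path into the image would remove the workaround. A sketch of the Dockerfile line, assuming the template's `docker` user and pip installing scripts into `~/.local/bin`:

```dockerfile
# Put user-local pip scripts (jupyter, ipython, ...) on PATH for all
# processes in the container, not just interactive shells.
ENV PATH=/home/docker/.local/bin:$PATH
```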
make init-docker takes a lot of time.
This is detrimental to the developer experience.
I think make init-docker should be faster.
(Currently no idea 😇)
The pwd command missing the -u option causes a problem in docker/Dockerfile.
I would like to replace Travis CI with GitHub Actions or CircleCI, since Travis might not provide a free plan for open-source projects in the future.
type-check: ## check types with mypy
	mypy -p package-name
We need to add mypy.
I'd like to add packaging scripts.
I would like to see the possible options via the help target.
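A common pattern for this (a sketch, not necessarily the template's actual implementation) is to have help parse the `##` comments that targets such as `type-check` above already carry:

```make
help: ## Show available targets with their descriptions
	@grep -E '^[a-zA-Z_-]+:.*?## ' $(MAKEFILE_LIST) | \
		awk 'BEGIN {FS = ":.*?## "} {printf "%-28s %s\n", $$1, $$2}'
```

Each target documented with `target: ## description` then shows up automatically in `make help`, so the list never goes stale.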
Python's default unittest is outdated.
Pytest is more useful.
To help users understand the basic flow, Docker Science should provide a default target that initializes the data and the container and then runs the flow to generate machine learning models.
I would like to specify Dockerfile with environment variable.
make DOCKERFILE=docker/Dockerfile.cpu
I would like to run only specified test cases. The following is an implementation:
export TARGET_TEST_CASE=tests.simple_application.TestApplication
run-specified-test-case: ## Run the specified test case
	$(PYTHON) -m unittest $(TARGET_TEST_CASE)
We run this target with make run-specified-test-case.
The working directory is /work in the container.
Need a tutorial and some description of handling projects with multiple Dockerfiles.