
anarel-manage's People

Contributors

chrisvam, davidslac, hblair, msdubrovin, slacmshankar, tjlane, valmar, weninc


anarel-manage's Issues

force opencv version >=3

With ana-1.2.7, opencv got downgraded from 3.x (defaults) --> 2.4 (conda-forge), and pillow went up from 3.x --> 4.x. I can upgrade opencv by hand after the fact, but this causes the new pillow to go back down.

Doing this for the ana-1.2.7 series, I get

(manage) (psreldev) psel701: /reg/g/psdm/sw/conda $ conda install opencv=3.1.0 --name ana-1.2.7
Fetching package metadata .......................
Solving package specifications: .

Package plan for installation in environment /reg/g/psdm/sw/conda/inst/miniconda2-prod-rhel7/envs/ana-1.2.7:

The following packages will be UPDATED:

    opencv:  2.4.13-np111py27_1 conda-forge --> 3.1.0-np111py27_1 defaults   

The following packages will be DOWNGRADED due to dependency conflicts:

    jasper:  1.900.1-4          conda-forge --> 1.900.1-1         conda-forge
    jpeg:    9b-0               defaults    --> 8d-2              defaults   
    libtiff: 4.0.6-3            defaults    --> 4.0.6-2           defaults   
    pillow:  4.1.0-py27_0       defaults    --> 3.4.2-py27_0      defaults   

Proceed ([y]/n)? y

jasper-1.900.1 100% |#####################################################################################################################################################################################| Time: 0:00:00 500.54 kB/s

new packages

conda-build 2.1.8
tensorflow 1.1.0-rc0
hdf5 1.10.1-pre1
h5py 2.7 final
keras 2.0.0 I think

new packages

h5py 2.7 rc3
git
cmake
conda 4.3.8 - does the sed -E fix work on rhel5?
conda-build 2.1.2
tf 12
keras

update tensorflow

Now that our GPUs have cuda 8, we can update to the latest tensorflow and cudnn.

specify test release path through conda_setup.

@slaclab/psdm and @chrisvam would like to specify a test release through conda_setup. Right now, your current directory has to be a test release for it to work.

I have added this feature, it is in production. Do

source conda_setup --reldir /path/to/my/testrelease

Please try it, and close this if it looks good.

automate documentation generation

We made /reg/g/psdm/sw/conda/web

then we have

ana-releases
ana-releases/psana-conda-1.2.7

and web/ana -> ana-releases/psana-conda-1.2.7

in psana-conda-1.2.7, we need to generate an index.html that is a table with links to the checked-in documentation for packages, like PyDataSource --> /reg/g/psdm/sw/conda/scratch/psana-conda-1.2.7/PyDataSource/doc/build/html
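A sketch of that generator as a bash function; the doc/build/html layout follows the PyDataSource example above, and the function name is made up:

```shell
# Sketch: build an index.html table linking each package's checked-in docs.
# Assumes each package keeps its docs in <pkg>/doc/build/html under scratch.
make_doc_index () {
  local scratch=$1 out=$2 docdir pkg
  {
    echo "<html><body><table>"
    for docdir in "$scratch"/*/doc/build/html; do
      [ -d "$docdir" ] || continue
      pkg=${docdir#"$scratch"/}   # strip the scratch prefix ...
      pkg=${pkg%%/*}              # ... and keep the package directory name
      echo "<tr><td><a href=\"$docdir\">$pkg</a></td></tr>"
    done
    echo "</table></body></html>"
  } > "$out"
}
# e.g.: make_doc_index /reg/g/psdm/sw/conda/scratch/psana-conda-1.2.7 index.html
```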

nightly builds

We need to automate nightly builds. Here are manual steps

source conda_setup

This makes the management code available, along with a central conda install.

Directory Structure

We'll do this in our own account for development. So make some directories like

mkdir conda-root
cd conda-root
mkdir downloads
cd downloads
mkdir anarel

cd back to conda-root
do

pwd

to see what it is. Say it is /reg/neh/home/davidsch/projects/conda-root

Latest Tags

When someone checks in a new tag, we first want to check out all the psana source code.

In production, as psreldev, we'll do

ana-rel-admin --force --cmd psana-conda-src --name 9.9.9  

Where we're just picking some crazy version like 9.9.9, and we'll plan on overwriting our previous work, so we use --force.

For development, we can do this

ana-rel-admin --cmd psana-conda-src --name 9.9.9 --basedir /reg/neh/home/davidsch/projects/conda-root --tagsfile /reg/g/psdm/sw/conda/manage/config/psana-conda-svn-pkgs-tst 

You may need to do kinit beforehand. And github has to work.

Psana Recipe

we need to make a recipe to build a psana 9.9.9

cp -r /reg/g/psdm/sw/conda/manage/recipes/psana/psana-conda-opt .

edit the version line to use your 9.9.9

and edit the source file line

for me it is

source:
  fn: /reg/neh/home/davidsch/conda-root/downloads/anarel/{{ pkg }}-{{ version }}.tar.gz

Build Psana

Now do

conda-build --no-locking psana-conda-opt

We want to know that this works on rhel5, rhel6, rhel7

install cern root

Phil has requested root. I'm concerned root will create problems for our installations. You need to get it from the NLeSC channel; there are notes on this page https://nlesc.gitbooks.io/cern-root-conda-recipes/content/installing_root_via_anaconda.html. A dry run shows

conda install -c NLeSC root --name ana-1.2.0

Package plan for installation in environment /reg/g/psdm/sw/conda/inst/miniconda2-dev-rhel7/envs/ana-1.2.0:

The following NEW packages will be INSTALLED:

    cloog:         0.18.0-0            defaults
    fftw:          3.3.4-2             NLeSC   
    gcc:           4.8.2-25            NLeSC   
    glibc:         2.12.2-3            NLeSC   
    gmp:           5.1.2-3             NLeSC   
    graphviz:      2.38.0-4            NLeSC   
    gsl:           1.16-2              NLeSC   
    isl:           0.12.2-2            NLeSC   
    linux-headers: 2.6.32-1            NLeSC   
    mpc:           1.0.1-2             NLeSC   
    mpfr:          3.1.2-2             NLeSC   
    root:          6.04-py2.7_gcc4.8.2 NLeSC   

What is most concerning is the older gcc (we have 4.8.5 on the rhel7 machines) with presumably the older glibc. I think Kareem has been trying to make conda environments with root and boost and png libraries and has been running into problems.

python 3.6

We can't move to python 3.6 until all the packages we put in the py3 environment support it. The first conflict I got was with jupyterhub.

matplotlib 2.0.0

Wondering why we didn't pick up the latest matplotlib with ana-1.2.0; doing a dry run of installing it shows the changes we would get:

Package plan for installation in environment /reg/g/psdm/sw/conda/inst/miniconda2-dev-rhel7/envs/ana-1.2.0:

The following NEW packages will be INSTALLED:

    jasper:       1.900.1-3         conda-forge
    olefile:      0.44-py27_0       defaults   
    subprocess32: 3.2.7-py27_0      defaults   

The following packages will be UPDATED:

    freetype:     2.5.5-2           defaults    --> 2.6.3-1            conda-forge
    icu:          54.1-0            defaults    --> 56.1-4             conda-forge
    jpeg:         8d-2              defaults    --> 9b-0               defaults   
    libpng:       1.6.27-0          defaults    --> 1.6.28-0           conda-forge
    libtiff:      4.0.6-2           defaults    --> 4.0.6-3            defaults   
    matplotlib:   1.5.1-np111py27_0 defaults    --> 2.0.0-np111py27_1  conda-forge
    pillow:       3.4.2-py27_0      defaults    --> 4.0.0-py27_1       conda-forge

The following packages will be DOWNGRADED due to dependency conflicts:

    cairo:        1.14.8-0          defaults    --> 1.14.6-0           conda-forge
    fontconfig:   2.12.1-2          defaults    --> 2.11.1-6           conda-forge
    harfbuzz:     0.9.39-2          defaults    --> 0.9.39-1           defaults   
    opencv:       3.1.0-np111py27_1 defaults    --> 2.4.12-np111py27_2 conda-forge
    pango:        1.40.3-1          defaults    --> 1.40.1-0           conda-forge

looks like it is not in defaults, and would downgrade opencv

Latest jinja2 (2.9.5) breaks ddl codegen

ana-1.2.2 has jinja2 2.9.5. When psddlc from the psddl repo is used to generate the pdsdata code with it, the code is incorrect. You see a generated line that looks like

* var = ...

instead of

Evr::OutputMapV2 * var = ...

so the typename is getting lost as a template parameter during code generation.

The jinja2 2.7 to 2.8 transition introduced a bug that was very difficult to track down. I don't know that we should spend that much time on this bug. Two options:

Special Environment for DDL

Then we can always have the latest jinja2 in ana-current.

We'd make a special environment with jinja2 2.8 to compile the DDL, but development and testing should be done in ana-current. I think this environment needs

  1. scons
  2. psana-conda
  3. maybe cython

I guess I'll put in all the build dependencies of psana-conda. I'll have to make the envs.sh files to set environment variables also. We'll call the environment

ddl

and put it in all six installations: dev/prod rhel5/6/7. We may have to periodically update psana-conda for these environments. We should probably pin jinja2 in them?

Pin Jinja

Or should I just pin jinja2 to 2.8.1? Downgrading jinja2 to 2.8.1 introduced no other package changes for ana-1.2.2, so I think I'll do that.
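A sketch of the pin, assuming conda of this vintage honors a conda-meta/pinned file inside the environment; the function name and the example path in the comment are illustrative:

```shell
# Sketch: pin jinja2 at 2.8.1 in an existing environment by appending a
# MatchSpec line to the env's conda-meta/pinned file.
pin_jinja2 () {
  local prefix=$1   # the environment prefix directory
  mkdir -p "$prefix/conda-meta"
  echo 'jinja2 2.8.1' >> "$prefix/conda-meta/pinned"
}
# e.g.: pin_jinja2 /reg/g/psdm/sw/conda/inst/miniconda2-prod-rhel7/envs/ana-1.2.2
```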

make package around hexanode lib?

A bug in SConsTools:
https://jira.slac.stanford.edu/browse/PSRT-168

is breaking the RPM builds (search for " error" in the log below):

http://pswww.slac.stanford.edu/buildbot/builders/nightly-build-rhel7/builds/531/steps/build-opt-all/logs/stdio

We won't be able to build a new psana release until we get it in conda properly.

Right now it is installed at

/reg/common/package/hexanodelib/0.0.1/x86_64-centos7-gcc485

among the files

/reg/common/package/hexanodelib/0.0.1/x86_64-centos7-gcc485 $ ls -1
calibration_table.txt   # this is part of the example sort.cpp below, it reads this file
compile_x64.txt   # shows how we'll compile: g++ -o sort -m64 -Wall -O3 sort.cpp libResort64c_x64.a
                            # we don't need the -m64 here, it's the default
compile_x86.txt
libResort64c_x64.a                 # This looks like all we get for linux, a static lib
libResort64c_x86.a                  # wrong arch
resort64c.h                               # we'll put this at $CONDA_PREFIX/include/hexanodelib/resort64c.h

resort64c_VS2010_x64.dll           # looks like all of this is for windows, visual studio?
resort64c_VS2010_x64.lib              
resort64c_VS2010_x86.dll
resort64c_VS2010_x86.lib

sort.cpp                                          # example

sort.sln                             # visual studio project files
sort.vcxproj
sort.vcxproj.filters
sort.vcxproj.user

sort_non-LMF_from_1_detector.zip
sorter.txt                                      # example config file for sort.cpp example

So we'll just take

libResort64c_x64.a -> $CONDA_PREFIX/lib 
resort64c.h              -> $CONDA_PREFIX/include/hexanodelib/resort64c.h
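That copy step could be most of the recipe's build.sh; a sketch, with a made-up function name and the source path from above:

```shell
# Sketch: install the hexanodelib static lib and header into a conda
# environment, following the plan above. In a real recipe build.sh, the
# second argument would be conda-build's $PREFIX.
install_hexanodelib () {
  local src=$1 prefix=$2
  mkdir -p "$prefix/lib" "$prefix/include/hexanodelib"
  cp "$src/libResort64c_x64.a" "$prefix/lib/"
  cp "$src/resort64c.h" "$prefix/include/hexanodelib/"
}
# e.g.: install_hexanodelib \
#   /reg/common/package/hexanodelib/0.0.1/x86_64-centos7-gcc485 "$PREFIX"
```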

then we'll have to make a conda proxy package around it, like
https://github.com/lcls-psana/hdf5/blob/conda/SConscript

and code will link against it via that proxy package.

or should the package be called hexanode? not hexanodelib

source conda_setup from within script - error with command line arguments

@koglin ran into an issue using conda_setup from a script.

Users may write scripts like

#!/bin/bash

source conda_setup
# process command line arguments

However, per http://superuser.com/questions/1029431/how-to-prevent-source-in-a-bash-script-from-passing-the-scripts-arguments, conda_setup will see the script's command line arguments, and it currently errors out when it processes them.

Looks like the best practice is for users to do

source conda_setup ""

to not pass the script's command line arguments.

I could also look into

http://stackoverflow.com/questions/12818146/python-argparse-ignore-unrecognised-arguments

to not complain about unrecognized arguments, but this doesn't seem as robust. Some of the user script arguments may coincidentally be the same as those of conda_setup, and outside the script, we'd rather throw an error.
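A minimal bash demonstration of why the empty argument helps: a sourced file inherits the caller's positional parameters unless the source command passes its own. Here inner.sh stands in for conda_setup, and the caller functions simulate a user script run with --myflag:

```shell
# inner.sh stands in for conda_setup: it just reports the args it sees.
cat > inner.sh <<'EOF'
echo "inner sees $# arg(s), first=[$1]"
EOF

caller_no_args () {
  set -- --myflag       # pretend the user's script was invoked with --myflag
  source ./inner.sh     # no args to source: the caller's args leak through
}
caller_empty_arg () {
  set -- --myflag
  source ./inner.sh ""  # explicit args replace the caller's during the source
}
```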

add yaml-cpp

We are using this package for the lc2 hdf5 work; we should integrate it more.

subsequent success/failure emails for same step are rejected

The automation generates an email with failure/success headers here:

https://github.com/slaclab/anarel-manage/blob/master/pylib/anarelmanage/automation.py#L19

and

https://github.com/slaclab/anarel-manage/blob/master/pylib/anarelmanage/automation.py#L28

it sends it to pcds-ana-l, but that rejects it if the subject line is something it has seen before. One could modify the subject lines to include the date and time in order to make a unique subject.
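A sketch of that subject-line fix in shell; the function name is made up, and the mail command in the comment is only illustrative:

```shell
# Sketch: suffix the subject with a UTC timestamp so the list server
# never sees a duplicate subject line.
unique_subject () {
  echo "$1 [$(date -u +%Y-%m-%dT%H:%M:%SZ)]"
}
# e.g. (illustrative):
#   mail -s "$(unique_subject 'SUCCESS: anarel build')" pcds-ana-l < body.txt
```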

Disable nb_conda_kernels

@weninc We get a long list of conda environments when jhub is started; it is generated by the nb_conda_kernels package.

If we remove the file
/reg/g/psdm/sw/conda/inst/miniconda2-prod-rhel7/envs/ana-1.1.0-py3/etc/jupyter/jupyter_notebook_config.json

then we only get the kernels that we are configuring via our own json files

Options

  1. remove this file as a postprocessing step
  2. manage our own _nb_ext_conf package, which installs this file and others
  3. hack the nb_conda_kernels package to generate the correct kernels for us: this line
     https://github.com/Anaconda-Platform/nb_conda_kernels/blob/master/nb_conda_kernels/manager.py#L184
     loops through all the environments, and this line
     https://github.com/Anaconda-Platform/nb_conda_kernels/blob/master/nb_conda_kernels/manager.py#L194
     currently doesn't set the environment variables we need (SIT_DATA) that we handle in #10, so we wouldn't do #10 if we forked nb_conda_kernels

the file we might delete with 1) has the content

(ana-1.1.0-py3) (psreldev) psel701: /reg/g/psdm/sw/conda/inst/miniconda2-prod-rhel7/envs/ana-1.1.0-py3/etc/jupyter $ cat jupyter_notebook_config.json.remove
{
  "NotebookApp": {
    "kernel_spec_manager_class": "nb_conda_kernels.CondaKernelSpecManager",
    "nbserver_extensions": {
      "nb_conda": true,
      "nb_anacondacloud": true,
      "nbpresent": true
    }
  }
}

That is, it loads this new conda kernel spec manager. We are currently loading the kernels we want through the standard kernel spec, so this new one must run the standard one and do more: call the old method, add its own kernels, and return all environments.
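Option 1 could look like this postprocessing sketch; the function name is made up, and it renames the file rather than deleting it, matching the .remove file shown above:

```shell
# Sketch of option 1: after building an environment, disable the
# nb_conda_kernels hook by moving its jupyter config file aside.
remove_nb_conda_hook () {
  local cfg=$1/etc/jupyter/jupyter_notebook_config.json
  if [ -f "$cfg" ]; then
    mv "$cfg" "$cfg.remove"   # keep a copy rather than delete outright
  fi
}
# e.g.: remove_nb_conda_hook \
#   /reg/g/psdm/sw/conda/inst/miniconda2-prod-rhel7/envs/ana-1.1.0-py3
```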

Breaking up psana

Right now psana-conda is one big package with lots of stuff. This causes problems. One is developing pdsdata: you end up having two copies of pdsdata around, the old one and the one you are developing. Developing pdsdata should look like

conda create --name ddl-conda psddl
github clone lcls-psana/psddldata
svn checkout pdsdata
...

That is, you just create a lightweight environment with the ddl compiler; that's all you need as a DAQ developer. You don't need a big conda environment with another copy of pdsdata hiding in it, as well as another copy of psddldata.

So, a plan for breaking up psana-conda

  1. SConsTools gets its own package, and it knows how to install small things into conda environments - now we can split off any other package in the ana build system and create a recipe that installs it in a conda environment
  2. psddl gets its own package
  3. psddldata might not be a conda package, it is something we checkout to make psana
  4. Split off Translator, get static openmpi link dependency out - right now psana-conda run dependencies don't specify which openmpi, but this means the Translator won't run if one uses mpich
  5. I'd be for a 'core' psana - all the code to parse XTC files and create events, one package
  6. Then more algorithmic stuff
  7. Of course split off things like psmon and psgeom that just use psana - but maybe first figure out what that core package is, and whether psmon etc. can just depend on that?

conda_setup message wrong for test release

I used conda_setup to activate a test release built against dev-ana-1.2.0, but it says it is activating a production environment. Need to fix this message, or remove it if we make quiet the default.

How to develop new packages that depend on psana?

Packages - conda, pip, SConsTools

Our scons build system allows us to develop new "packages" that become part of the psana-conda package. I say "packages" in quotes because they are not conda or python packages - they are packages as defined by the SConsTools build system we use for psana. By developing a new SConsTools package, we make psana-conda bigger.

Moving forward, we would like to develop new packages as standard python or conda packages when it makes sense. For example, if someone is developing a pure python package that interfaces with psana just through an import psana, I think it would be better to develop it as a standard python package -- i.e., write a setup.py. @slacmshankar has done this with logbookclient, and now @tjlane has pscache.

We'll go through how to add such a package, like pscache, into the ana environments. All of this will be done as the admin account psreldev.

Create Recipe

as psreldev

cd /reg/g/psdm/sw/conda/manage/recipes/external
cp -r logbookclient pscache

now edit pscache/meta.yaml. For this checkin, there is a tag called v0.1, but the package version, per the setup.py, is 0.0.1 - take care to use the v0.1 with the git_tag and the 0.0.1 for the version.

Also make the build and run dependency psana-conda

for the build script, we are copying what Murali figured out, i.e

    - pip install --no-deps --disable-pip-version-check .

you'll also see people do things like

  script: python setup.py install --single-version-externally-managed --record=record.txt

the main point is you want to do a standard python install, but not have pip or setuptools install dependencies - you list them in the build/run sections of your conda meta.yaml and let conda manage them.

Build Recipe

execute the command, from a machine with internet access,

(manage) (psreldev) psel701: /reg/g/psdm/sw/conda/manage/recipes/external $ ana-rel-admin --cmd bld-pkg --recipe pscache 

you could just do conda-build, but ana-rel-admin will also put the output in the rhelx channel and update the channel index.

Repeat that on a rhel5 and a rhel6 machine (psdev106 and psdev105). This is not ideal, since a pure python package will be the same on each, but for now we are just replicating all packages 3 times.

Update anarel.yaml

I.e, an entry like

https://github.com/slaclab/anarel-manage/blob/master/config/anarel.yaml#L321

now we will add pscache to the conda environments we build.

Since pscache depends on psana, and psana is python 2.7 only, we'll follow the logbookclient pattern and specify that we skip it in the py3 environment.

update the package build order

As the number of packages we maintain grows, we need to automate building them all with one command. We also need to build them in the correct order. For example, if we update openmpi, then we will probably update mpi4py. We'd like to edit the two recipes, and then do one command to build openmpi and then mpi4py.

This config file

https://github.com/slaclab/anarel-manage/blob/master/config/pkg_build_order.yaml#L66

lists all the packages we need to build, with the paths to their recipe directories (relative to manage), in the order they should be built. Add an entry for pscache there (it won't matter where, since pscache only depends on psana; the order only matters for things like openmpi -> mpi4py, etc.)

You can run

ana-rel-admin --cmd bld-all

it will go through the list and build anything that needs to be built.
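The loop inside bld-all presumably looks something like this sketch; needs_build and build_recipe are stand-ins here, not real ana-rel-admin internals:

```shell
# Sketch: walk an ordered recipe list (one recipe path per line, in
# dependency order like openmpi before mpi4py) and build only what's
# out of date. needs_build/build_recipe are hypothetical hooks.
build_all () {
  local list=$1 recipe
  while read -r recipe; do
    if [ -n "$recipe" ] && needs_build "$recipe"; then
      build_recipe "$recipe"
    fi
  done < "$list"
}
```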

Build new ana environments

We should be able to build a new ana environment now; there are separate instructions for that on confluence, or get in touch with the admin.

Developer Responsibility

Maintain your repo (pscache) and update the git_tag in the conda recipe when you cut a new version.

new packages

Chuck says these will be good:

biopython

transitions

maybe
cbf from paulscherrerinstitute channel

update psana conda to latest boost, qt, and python 2.7.13 on conda-forge

Can we do this? To try, make a new build environment, look at dependencies here

https://github.com/slaclab/anarel-manage/blob/master/recipes/psana/psana-conda-opt/meta.yaml

and do

 conda create --name psana-update -c conda-forge "python>=2,<3" scons cython boost mysql ndarray qt numpy matplotlib scipy pyzmq openmpi=1.10.6=lsf_verbs_1 hdf5 h5py mpi4py=2.0.0=py27_openmpi_104  pytables

so that we get our openmpi and corresponding mpi4py, but we get most everything else from conda-forge. I did this as psreldev in the rhel7 --dev environment.

Then, to build psana, one can do, from your own account,

cp -r /reg/g/psdm/sw/conda/scratch/psana-conda-1.2.7 .

to get our source. Unfortunately this is not properly under git - we should extend the ana-rel-admin command to get source with a full git checkout from master so we can update it. Note the 'extpkgs' subdir with two packages (pdsdata and psalg) from /afs/svn

Building psana is a little complicated: one has to make a SConstruct file (a link to what is in SConsTools), define some environment variables, and define SIT_ARCH ahead of time. Easiest is to run the build.sh from the recipe like this:

  1. source conda_setup --dev --env psana-update # activate my env with new packages
  2. SIT_ARCH=x86_64-rhel7-gcc48-opt bash /reg/g/psdm/sw/conda/manage/recipes/psana/psana-conda-opt/build.sh

anaconda-client missing from anarel.yaml

I'm building ana-1.2.0; it has a new conda, etc., and the anaconda program is not being found. In ana-1.1.0 it is still there, in the package anaconda-client 1.6.0. I'm not sure what triggered the installation of anaconda-client in the ana-1.1.0 environment; I did not explicitly list it in anarel.yaml.

I keep this in the manage environment to upload packages. I think it is useful for users, so I'll list it in anarel.yaml.

rename our local build of 'tables' to use package name pytables

This is very confusing: what is the package name for pytables? The website reports pytables, but on pypi it is tables, and that is what we've used for a while - but conda uses pytables. This creates problems: if someone tries to meet psana-conda's dependencies by installing their own pytables, we still install our tables.

I made this issue

ContinuumIO/anaconda-issues#1229

but I think conda has it correct, somebody made a mistake when they used the import name, tables, for the pypi package name.

Automate generation of jupyterhub kernels with new ana releases

Clemens has created a kernels configuration that clears LD_LIBRARY_PATH and sets SIT_DATA, which depends on the specific conda environment (i.e., it is different for ana-1.1.0 and ana-1.0.7), as well as SIT_ROOT for ana-1.1.0.

Next I need to automate the generation of this when new environments are built.

All jupyterhub config will go at

/reg/g/psdm/sw/conda/jhub_config

there, for each conda installation we'll have a subdirectory, in particular

/reg/g/psdm/sw/conda/jhub_config/prod-rhel7

in that directory we'll have

/reg/g/psdm/sw/conda/jhub_config/prod-rhel7/kernels

that is a directory called kernels. That name is dictated by the jupyter stuff. In there, we'll have subdirs for each environment for which we define this kernel config, i.e

/reg/g/psdm/sw/conda/jhub_config/prod-rhel7/kernels/ana-1.1.0
/reg/g/psdm/sw/conda/jhub_config/prod-rhel7/kernels/ana-1.1.0-py3

Each of these subdirs will have a file called kernel.json (the name jupyter expects); here is the ana-1.1.0 one:

{
 "display_name": "Python 2 ana-1.1.0", 
 "language": "python", 
 "argv": [
  "/reg/g/psdm/sw/conda/inst/miniconda2-prod-rhel7/envs/ana-1.1.0/bin/python", 
  "-m", 
  "ipykernel", 
  "-f", 
  "{connection_file}"
 ],
 "env": {"SIT_DATA": "/reg/g/psdm/sw/conda/inst/miniconda2-prod-rhel7/envs/ana-1.1.0/data:/reg/g/psdm/data",
         "SIT_ROOT": "/reg/g/psdm", "LD_LIBRARY_PATH": ""}
}

The python 3 one only varies with the display name, and we don't need to set SIT_ROOT or SIT_DATA, but I guess it can't hurt.
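Generating each kernel file could be sketched like this; the function name is made up, and the JSON mirrors the ana-1.1.0 example above:

```shell
# Sketch: emit a kernel.json for a given environment, following the
# ana-1.1.0 example. Arguments: env name, env prefix, output file.
write_kernel_json () {
  local env_name=$1 prefix=$2 out=$3
  cat > "$out" <<EOF
{
 "display_name": "Python 2 $env_name",
 "language": "python",
 "argv": ["$prefix/bin/python", "-m", "ipykernel", "-f", "{connection_file}"],
 "env": {"SIT_DATA": "$prefix/data:/reg/g/psdm/data",
         "SIT_ROOT": "/reg/g/psdm", "LD_LIBRARY_PATH": ""}
}
EOF
}
# e.g.: write_kernel_json ana-1.1.0 \
#   /reg/g/psdm/sw/conda/inst/miniconda2-prod-rhel7/envs/ana-1.1.0 kernel.json
```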

run conda-build from central install

You can't do this right now; conda-build doesn't support it. I made an issue, conda/conda-build#1633, which is a duplicate of a related issue. Following that, I did something I would rather not have: I made the directories

miniconda2-prod-rhel7/locks world readable

for all six installations, but it didn't help me?

Matplotlib dependency of packages in the anaconda repo

Hi,

I don't know if this is the right place to ask about the conda repo lcls-rhel7; if anybody knows a better place, please let me know.

The psana-conda package published in lcls-rhel7 has a matplotlib dependency where an exact build is specified (not only a version). This build is neither in the lcls-rhel7 repo, nor anaconda, nor conda-forge - I can't find the specified build anywhere.

What's the reasoning behind this dependency? Can you loosen it to a version, or supply the matplotlib build as well?

Thanks
Felix

update conda-build and conda

There are new versions, but test everything, including installing from file-based channels (we are getting weird errors on ocio machines with the latest conda), indexing the file channels, etc.

fix up env_vars.sh for ana-1.2.0-gpu

We just put cuda 8 on our GPU machine with the K40; both cuda 7.5 and cuda 8 are there, but cuda 8 is the default. I need to manually go and update the environment variables that are set in this file. Maybe this file should be under version control? I'm changing the LD_LIBRARY_PATH and PATH to use the /usr/local/cuda-7.5 dir.

downgrade openmpi to 1.10 series

Chris has found that the 1.10 series resolves end-of-run crashes that we get with the 2.x series. We should get the latest in the 1.10 series - 1.10.6, I think?

conda_setup doesn't run on old python

On pswww, conda_setup won't source from the default system python. That python is 2.4.3, and it doesn't have argparse. The mechanism I put in place to find argparse isn't working: the argparse, if it is found, doesn't load - looks like language differences.

This is weird because https://pypi.python.org/pypi/argparse says argparse 1.4.0 is tested on python 2.4, so I'm not sure why the argparse I copied in isn't working.

In any case, the simplest thing is just to run from a more modern version of python. Unfortunately, the simplest way to do that is to turn our one-step instruction:

source /reg/g/psdm/bin/conda_setup

into two

source /reg/g/psdm/etc/ana_setup.sh
source conda_setup

just so that we can first get the python 2.7 that we built for rhel5, 6, or 7 on the path, before sourcing conda_setup.

Or, we could work on getting conda_setup/argparse to work on the rhel5 machines, or port it to bash
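The essential job of an ana_setup.sh would be to prepend the right python to PATH, idempotently; a sketch, with an illustrative directory and a made-up function name:

```shell
# Sketch: put a modern python's bin directory at the front of PATH before
# conda_setup's argparse-based logic runs. Does nothing if already present.
ana_setup_path () {
  local pydir=$1
  case ":$PATH:" in
    *":$pydir:"*) : ;;                    # already on PATH, do nothing
    *) PATH=$pydir:$PATH; export PATH ;;  # otherwise prepend it
  esac
}
# e.g.: ana_setup_path /reg/g/psdm/sw/external/python/2.7/bin  (illustrative)
```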

ndarray 1.1.7 is not building

To build this right now, as psreldev, one goes to where I have checked out the git branch, and does

(manage) (psreldev) psel701: /reg/g/psdm/sw/conda/manage-git $ ana-rel-admin --cmd bld-pkg --recipe recipes/psana/ndarray

That is, go to manage-git, and run this admin command, but point to the recipe in the git branch. This has to be done on psel701, psdev105 and psdev106.

The error includes

Warning: failed to download source.  If building, will try again after downloading recipe dependencies.
Error was: 
Command '['/reg/g/psdm/sw/conda/inst/miniconda2-dev-rhel6/envs/ana-1.2.2/bin/git', 'checkout', '1.1.7']' returned non-zero exit status 1
Traceback (most recent call last):

do we have the tag correct?

soft links and user environments

An always_softlink option was added as part of #3870 & #3876 in conda 4.3.0. They closed issue 3308, so we should try to add this to the condarc files and see if we can avoid putting 3GB in user home drives.

More flexible ana-rel-admin command to update current (ana-current, or dm-current, etc)

@chrisvam and @slaclab/psdm suggest a more flexible ana-current mechanism. Right now the command is hard-coded for ana environments. I've already made the directory structure

/reg/g/psdm/sw/conda/current

with a current/ana subdir where I copied the ana-current files. Now we need to:

  • remove the --cmd change-ana-current command
  • add a --cmd change-current command and use --variant option to specify ana or dm, or something.
  • switch the --cmd auto to use the new command
  • remove the old ana-current directory
  • important: update conda_setup to read ana-current from the new location
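With the new layout, change-current could reduce to repointing a per-variant symlink under the current/ directory; a sketch (the function name and the symlink convention are assumptions, not the actual ana-rel-admin implementation):

```shell
# Sketch: "current" for each variant (ana, dm, ...) is a symlink under one
# root; changing current just repoints the variant's link.
change_current () {
  local root=$1 variant=$2 release=$3
  mkdir -p "$root"
  ln -sfn "$release" "$root/$variant"   # -n: replace the link, not follow it
}
# e.g.: change_current /reg/g/psdm/sw/conda/current ana ana-1.2.7
```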
