qarik-group / bucc
The fastest way to get a BUCC (BOSH, UAA, Credhub and Concourse)
License: Apache License 2.0
Inside the bosh-deployment repo I tried to add a cloud-config:
$ bosh update-cloud-config virtualbox/cloud-config.yml
Using environment '192.168.50.6' as user 'admin' (openid, bosh.admin)
Continue? [yN]: y
Updating cloud config:
Director responded with non-successful status code '500' response ''
It seems that wrong flag values are getting remembered, too.
Example:
bucc up --cpi=virtualbox
In state/flags:
cpi=virtualbox
Now running the correct command:
bucc up --cpi virtualbox
error:
unsupported flag: --cpi=virtualbox
flags for '--cpi virtualbox' are: --oauth-providers --proxy
When running bucc clean, the error is:
cp: cannot stat '/home/ramonskie/workspace/bucc/ops/cpis/virtualbox/flags/cpi=virtualbox.yml': No such file or directory
(the same cp error is repeated eight times in total)
'state' dir has been cleaned and 'vars.yml' has been moved to 'vars.yml.bck'
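Both symptoms point at the stored flag format; a minimal sketch of normalizing --flag=value into --flag value before persisting to state/flags, assuming the flags are handled in bash (not the actual implementation):
# hedged sketch: split --flag=value into two words before storing
normalized=()
for arg in "$@"; do
  case "$arg" in
    --*=*) normalized+=("${arg%%=*}" "${arg#*=}") ;;
    *)     normalized+=("$arg") ;;
  esac
done
set -- "${normalized[@]}"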
In another terminal:
$ which credhub
/usr/local/bin/credhub
In bucc window:
$ bucc credhub
installing credhub cli into: /Users/drnic/Projects/bosh_deployments/bucc/bin/
Setting the target url: https://192.168.50.6:8844
Login Successful
$ which credhub
/Users/drnic/Projects/bosh_deployments/bucc/bin/credhub
When I go into the bucc project for the first time, then direnv allow doesn't modify the PATH environment variable:
$ git clone https://github.com/starkandwayne/bucc.git
...
$ cd bucc/
$ direnv allow
direnv: loading .envrc
Whereas if I just put eval "$(./bin/bucc env)" in .envrc then it works:
$ echo 'eval "$(./bin/bucc env)"' > .envrc
$ direnv allow
direnv: loading .envrc
direnv: export ~PATH
I didn't test this with the ${vars_store} file being present yet.
But anyway, maybe you should just simplify your .envrc and leave eval "$(./bin/bucc env)" for both bash and zsh, don't you think?
Cheers,
/Benjamin
Can a license file be added to this repository?
Added bucc helper commands to back up and restore the state of the BUCC VM.
These helper commands should use bosh bbr.
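A rough sketch of what such helpers might look like; the command shapes, director_ip, and repo_root are assumptions, not the actual implementation (the jumpbox user and state/ssh.key appear elsewhere in this document):
backup() {
  # back up the director's state via BOSH Backup and Restore
  bbr director --host "${director_ip}" --username jumpbox \
      --private-key-path "${repo_root}/state/ssh.key" backup
}
restore() {
  # restore from a previously created bbr artifact directory
  bbr director --host "${director_ip}" --username jumpbox \
      --private-key-path "${repo_root}/state/ssh.key" restore --artifact-path "$1"
}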
After running bucc up to update an existing environment I'm getting:
vcap@lab:~/workspace/bucc$ fly -t bucc workers
name containers platform tags team state version
40730dcc-7171-40a6-442c-075795897c13 1 linux none none running 1.1
the following workers have not checked in recently:
name containers platform tags team state version
0aa2a9ea-4bb9-4d9b-4ce2-fdacd90c8e2c 0 linux none none stalled 1.1
76c3ee2d-de62-48ff-59ae-0f80bbf95abe 0 linux none none stalled 1.1
a6f0a9b9-c472-4217-6368-2c65201f6ce5 0 linux none none stalled 1.1
b3e15f0c-d4ab-4311-6a80-9228c064c628 1 linux none none stalled 1.1
these stalled workers can be cleaned up by running:
fly -t bucc prune-worker -w (name)
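With several stalled workers, a hedged one-liner that prunes them all (keyed off the state column in the table above):
fly -t bucc workers | awk '$6 == "stalled" {print $1}' | xargs -n1 fly -t bucc prune-worker -w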
$ bucc uaac
script: illegal option -- -
usage: script [-adkpqr] [-t time] [file [command ...]]
$ uname -a
Darwin starkair.local 15.6.0 Darwin Kernel Version 15.6.0: Mon Jan 9 23:07:29 PST 2017; root:xnu-3248.60.11.2.1~1/RELEASE_X86_64 x86_64
Same for bucc credhub
:
$ bucc credhub
script: illegal option -- -
usage: script [-adkpqr] [-t time] [file [command ...]]
I've never seen script before; I'm not sure what role it's performing and can't guess what the macOS flags might be.
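For context: the Linux (util-linux) and macOS (BSD) variants of script take different arguments, which would explain the illegal-option error. A hedged sketch of a portable wrapper:
# util-linux script accepts --version; BSD script does not, so branch on that
if script --version >/dev/null 2>&1; then
  script -qc "$cmd" /dev/null    # Linux: command string passed via -c
else
  script -q /dev/null $cmd       # macOS/BSD: command (word-split) follows the file
fi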
bucc credhub logs into credhub and allows us to use credhub anywhere; same for bucc fly. But we don't have a bucc bosh that does the bosh log-in. Can we please have this?
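For reference, a minimal sketch of what a bucc bosh helper could do, reusing the exports bucc env already emits elsewhere in this document (repo_root and the creds.yml path are assumptions):
export BOSH_ENVIRONMENT=192.168.50.6
export BOSH_CA_CERT="${repo_root}/state/ca.pem"
export BOSH_CLIENT=admin
export BOSH_CLIENT_SECRET="$(bosh2 int "${repo_root}/state/creds.yml" --path /admin_password)"  # path assumed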
I did ./bin/bucc at the root of this repo since I did not add bucc to the PATH.
It would be nice to add more info about how to run the bucc command: either run it as ./bin/bucc, or add it to the PATH so we can just run bucc from the repo root.
git clone https://github.com/starkandwayne/bucc.git
cd bucc
bucc up
Results are:
cp: /Users/drnic/Projects/bosh_deployments/bucc/ops/0-jumpbox-user.yml: No such file or directory
cp: /Users/drnic/Projects/bosh_deployments/bucc/ops/1-garden-runc.yml: No such file or directory
cp: /Users/drnic/Projects/bosh_deployments/bucc/ops/2-uaa.yml: No such file or directory
cp: /Users/drnic/Projects/bosh_deployments/bucc/ops/3-credhub.yml: No such file or directory
cp: /Users/drnic/Projects/bosh_deployments/bucc/ops/cpis/virtualbox/0-cpi.yml: No such file or directory
cp: /Users/drnic/Projects/bosh_deployments/bucc/ops/cpis/virtualbox/0-outbound-network.yml: No such file or directory
Opening file /Users/drnic/Projects/bosh_deployments/bucc/src/bosh-deployment/bosh.yml:
open /Users/drnic/Projects/bosh_deployments/bucc/src/bosh-deployment/bosh.yml: no such file or directory
Exit code 1
Suggestion: automatically run git submodule update --init if files are missing.
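A minimal sketch of that check, assuming src/bosh-deployment is the submodule whose bosh.yml went missing above (repo_root assumed):
# re-init submodules when the expected manifest is absent
if [ ! -f "${repo_root}/src/bosh-deployment/bosh.yml" ]; then
  git -C "${repo_root}" submodule update --init --recursive
fi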
Create a pipeline for integration testing bucc.
Work was already started here: https://github.com/starkandwayne/bucc/blob/develop/test/pipeline.yml
I think we would be interested in doing a PR for SoftLayer support. What are the requirements for adding a new IaaS to the tool?
What do you need from us to support this?
It would be nice to be able to resume a bucc-deployed VirtualBox VM.
Currently the bosh jobs won't be able to start because they are missing their persistent disk: cloudfoundry/bosh-virtualbox-cpi-release#7
Below are some things I have tried so far to work around the above issue, with mixed results:
df /var/vcap/data | grep /var/vcap/data # wait for the agent to mount /var/vcap/data
# find the partition fdisk knows about that is not yet mounted (appears only once across both lists)
disk=$(sort <(cat /etc/mtab | grep /dev/sd | cut -d ' ' -f1) <(fdisk -l | grep /dev/sd | grep -v : | grep -v swap | cut -d ' ' -f1) | uniq -u)
mount ${disk} /var/vcap/store
monit
# start the VM headless, using the VM CID recorded in the create-env state
vboxmanage startvm $(bosh int state/state.json --path /current_vm_cid) --type headless
# then, inside the VM (via bucc ssh): recreate the by-id symlink the agent expects
bucc ssh
cd /dev/disk/by-id/ && ln -s ../../sdc 1ATA
Use UAA to authenticate with Concourse.
git-repos/bucc$ ./bin/bucc up --cpi aws --lite
Deployment manifest: '/mnt/c/Users/Aashni/git-repos/bucc/src/bosh-deployment/bosh.yml'
Deployment state: '/mnt/c/Users/Aashni/git-repos/bucc/state/state.json'
Started validating
Failed validating (00:00:00)
Parsing release set manifest '/mnt/c/Users/Aashni/git-repos/bucc/src/bosh-deployment/bosh.yml':
Evaluating manifest:
- Expected to find variables:
- private_key
Exit code 1
Expecting the private_key, but I don't see that input in the vars.yml file. Please advise.
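For what it's worth, in bosh-deployment's AWS CPI ops files private_key appears to be the private key of the EC2 key pair used by create-env; a hedged sketch of supplying it:
# vars.yml (shape assumed): point private_key at your EC2 key pair's private key
private_key: /path/to/ec2-keypair.pem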
Inside the bucc dir, following the README:
bosh2 alias-env bucc
Then I go to one of my release repos:
$ bosh2 -e bucc stemcells
Using environment '192.168.50.6' as anonymous user
Finding stemcells:
Director responded with non-successful status code '401' response 'Not authorized: '/stemcells'
'
Exit code 1
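A hedged workaround for the 401: alias-env alone doesn't authenticate, so export the admin client credentials first (the /admin_password path in creds.yml is an assumption):
$ export BOSH_CLIENT=admin
$ export BOSH_CLIENT_SECRET="$(bosh2 int state/creds.yml --path /admin_password)"
$ bosh2 -e bucc stemcells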
https://ci.starkandwayne.com/teams/main/pipelines/bucc/jobs/upgrade-test/builds/29
less /var/vcap/sys/log/postgres/pre-start.stderr.log
+++ [[ -n /var/vcap/packages/postgres-9.6.4/lib ]]
+++ LD_LIBRARY_PATH=/var/vcap/packages/postgres-9.6.4/lib:/var/vcap/packages/postgres-9.6.4/lib
+ '[' -d /var/vcap/store/postgres/postgres-9.6.4 -a -f /var/vcap/store/postgres/POSTGRES_UPGRADE_LOCK ']'
+ '[' -d /var/vcap/store/postgres/postgres-unknown -a -f /var/vcap/store/postgres/postgres-unknown/postgresql.conf ']'
+ init_data_dir
+ '[' '!' -f /var/vcap/store/postgres/postgres-9.6.4/postgresql.conf ']'
+ su - vcap -c '/var/vcap/packages/postgres-9.6.4/bin/initdb -E utf8 --locale en_US.UTF-8 -D /var/vcap/store/postgres/postgres-9.6.4'
FATAL: could not open file "base/1/3597": No such file or directory
STATEMENT: DROP TABLE tmp_pg_description;
FATAL: could not open file "base/1/3597": No such file or directory
PANIC: could not open control file "global/pg_control": No such file or directory
Aborted
child process exited with exit code 134
initdb: removing data directory "/var/vcap/store/postgres/postgres-9.6.4"
could not open directory "/var/vcap/store/postgres/postgres-9.6.4": No such file or directory
initdb: failed to remove data directory
It works, but it would be nice to check whether bosh2 exists:
$ bucc credhub
/home/vcap/workspace/bucc-lite/bin/bucc: line 8: bosh2: command not found
Warning: The targeted TLS certificate has not been verified for this connection.
Setting the target url: https://192.168.50.6:8844
Login Successful
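A minimal guard sketch for the top of bin/bucc:
# fail fast with a clear message instead of "bosh2: command not found"
command -v bosh2 >/dev/null 2>&1 || {
  echo "bosh2 CLI not found on PATH" >&2
  exit 1
}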
$ bucc up
rm: /Users/drnic/Projects/bosh_deployments/bucc/state/manifests/*: No such file or directory
cp: directory /Users/drnic/Projects/bosh_deployments/bucc/state/manifests does not exist
find: /Users/drnic/Projects/bosh_deployments/bucc/state/manifests/*.yml: No such file or directory
usage: cp [-R [-H | -L | -P]] [-fi | -n] [-apvX] source_file target_file
cp [-R [-H | -L | -P]] [-fi | -n] [-apvX] source_file ... target_directory
find: /Users/drnic/Projects/bosh_deployments/bucc/state/manifests/*.yml: No such file or directory
usage: cp [-R [-H | -L | -P]] [-fi | -n] [-apvX] source_file target_file
cp [-R [-H | -L | -P]] [-fi | -n] [-apvX] source_file ... target_directory
find: /Users/drnic/Projects/bosh_deployments/bucc/state/manifests/*.yml: No such file or directory
Deployment manifest: '/Users/drnic/Projects/bosh_deployments/bucc/src/bosh-deployment/bosh.yml'
Deployment state: '/Users/drnic/Projects/bosh_deployments/bucc/state/state.json'
Started validating
Failed validating (00:00:00)
Parsing installation manifest '/Users/drnic/Projects/bosh_deployments/bucc/src/bosh-deployment/bosh.yml':
Validating installation manifest:
- cloud_provider.template.name must be provided
- cloud_provider.template.release must be provided
- cloud_provider.template.release '' must refer to a release in releases
$ /Users/Norman/Projects/bosh-env-bucc-vbox/bucc/bin/bucc
/Users/Norman/Projects/bosh-env-bucc-vbox/bucc/bin/bucc: line 4: realpath: command not found
usage: dirname path
I attempted to use the readlink and stat commands but it still failed. I ended up just using dirname $0.
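A portable sketch that sidesteps realpath (absent on stock macOS) entirely:
# resolve the directory containing the script without realpath
repo_root="$(cd "$(dirname "$0")/.." && pwd)"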
How do we add OpenStack Keystone v2 API support?
An ops file keystone-v2.yml already exists in bosh-deployment, but how do we enable it with bucc? Do we need to create a new CPI with this ops file included?
Since the '--lite' aspect is now optional, these vars should not be hardcoded.
https://ci.starkandwayne.com/teams/main/pipelines/bucc/jobs/compile-releases/builds/202
Try deploying with:
azs:
- name: z1
- name: z2
- name: z3
vm_types:
- name: default
vm_extensions:
- cloud_properties:
    ports:
    - 22/tcp
  name: all_ports
compilation:
  az: z1
  network: default
  reuse_compilation_vms: true
  vm_type: default
  workers: 5
networks:
- name: default
  subnets:
  - azs:
    - z1
    - z2
    - z3
    cloud_properties:
      name: director_network
    dns:
    - 8.8.8.8
    gateway: 10.245.0.1
    range: 10.245.0.0/16
    static:
    - 10.245.0.34
    type: manual
disk_types:
- disk_size: 1024
  name: default
stemcells:
- alias: large
  os: ubuntu-trusty
  version: '3468.11'
releases:
- name: os-conf
  sha1: 78d79f08ff5001cc2a24f572837c7a9c59a0e796
  url: https://bosh.io/d/github.com/cloudfoundry/os-conf-release?v=18
  version: '18'
- name: concourse
  sha1: 20e17e3ac079f1b1a5095329a4fb14de40317bb9
  url: https://bosh.io/d/github.com/concourse/concourse?v=3.7.0
  version: 3.7.0
- name: credhub-importer
  sha1: a945710d1d7d3dfb8f8e117aa6f81e2fb5c70709
  url: https://github.com/cloudfoundry-community/credhub-importer-boshrelease/releases/download/1/credhub-importer-1.tgz
  version: '1'
update:
  canaries: 1
  canary_watch_time: 1000 - 90000
  max_in_flight: 1
  update_watch_time: 1000 - 90000
instance_groups: []
name: bucc-compiled-releases
Then run bosh export-release -d bucc-compiled-releases os-conf/18 ubuntu-trusty/3468.11
It would be nice if releases and compiled CPI artifacts were cached between builds.
I only have bosh 2.0.1 since I totally got rid of bosh CLI v1; I use bosh instead of bosh2 as my command. When I run source <(bucc env), it does not successfully export all the environment variables it needs, since bucc env errors out with ./bin/bucc: line 8: bosh2: command not found before reaching those export clauses.
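A hedged sketch of resolving whichever v2 CLI name is installed instead of hardcoding bosh2:
# prefer bosh2 but fall back to a v2 'bosh' binary
bosh_bin="$(command -v bosh2 || command -v bosh)" || { echo "bosh CLI v2 required" >&2; exit 1; }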
Greetings,
I've been using and liking BUCC for a while, but now have a situation where I want to deploy a new instance and set it up so that I can test SAML integration between UAA (acting as the Service Provider, SP) and several Identity Providers (IdPs).
I think that in order to do that I need a UAA with a real SSL certificate, and therefore I'd like to provision my UAA so that it's front-ended by an ELB/ALB using certs issued by AWS Certificate Manager. Rationale: I'm testing, I don't want to incur the cost of certs, and AWS Certificate Manager will issue them for free.
I've been bashing my brains out trying to RTFM, and not getting very far. So, a couple of questions:
Sorry for the length of this description. Post back if you need additional detail. Any advice greatly appreciated!
Currently compile-releases does some manifest magic to figure out what needs to be compiled.
This leads to duplicate compiles, since there can be commits between the compile start and the update-ops-file step.
We could solve this by doing something like:
curl "https://s3-eu-west-1.amazonaws.com/bucc-compiled-releases/?prefix=concourse" | grep "3.2.1-ubuntu-trusty-3421.9"
Hi, we're looking at BUCC as an option for a simple bootstrap of our platform, and we want to run it inside a single server using the Docker CPI. We can't use the Virtualbox-based BOSH Lite because we're running on AWS EC2 VMs, and running BUCC directly on our AWS environment would take more setup than makes sense for a bootstrap.
I had a go using BUCC with Docker, but hit this error:
cloudfoundry/bosh-deployment#93
It's hard to debug this because there's zero documentation on how the Docker CPI is even intended to be used, so I don't even know if I'm just "holding it wrong". Do you have a working example of running BUCC using the Docker CPI?
When running the most current version of bucc on MacOS, it hangs indefinitely while trying to download the concourse release.
$ ./bin/bucc up --lite
Using flags: --cpi virtualbox --lite
Flag files: '/Users/loaner/workspace/bucc/state/{cpi,lite,flags}'
Deployment manifest: '/Users/loaner/workspace/bucc/src/bosh-deployment/bosh.yml'
Deployment state: '/Users/loaner/workspace/bucc/state/state.json'
Started validating
Downloading release 'bosh'... Skipped [Found in local cache] (00:00:00)
Validating release 'bosh'... Finished (00:00:00)
Downloading release 'bosh-virtualbox-cpi'... Skipped [Found in local cache] (00:00:00)
Validating release 'bosh-virtualbox-cpi'... Finished (00:00:00)
Downloading release 'os-conf'... Skipped [Found in local cache] (00:00:00)
Validating release 'os-conf'... Finished (00:00:00)
Downloading release 'bosh-warden-cpi'... Skipped [Found in local cache] (00:00:00)
Validating release 'bosh-warden-cpi'... Finished (00:00:00)
Downloading release 'garden-runc'... Skipped [Found in local cache] (00:00:00)
Validating release 'garden-runc'... Finished (00:00:00)
Downloading release 'uaa'... Skipped [Found in local cache] (00:00:00)
Validating release 'uaa'... Finished (00:00:00)
Downloading release 'concourse'... <------------------------------ Stays here forever
Any ideas how I can debug this?
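Two hedged ways to dig in: the bosh CLI honors BOSH_LOG_LEVEL, which should show the stuck HTTP request, and the Concourse 3.7.0 release URL used elsewhere in this document can be fetched by hand to rule out network issues:
$ BOSH_LOG_LEVEL=debug ./bin/bucc up --lite
$ curl -vL -o /tmp/concourse.tgz 'https://bosh.io/d/github.com/concourse/concourse?v=3.7.0'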
Has anyone else experienced this issue?
curl 7.47.0
https://github.com/starkandwayne/bucc/blob/master/bin/bucc#L240
606f18e#diff-bae68b14f9eac7684d127fde96e76412
Assumption: direnv installed
$ git clone [email protected]:starkandwayne/bucc.git new-bucc
Cloning into 'new-bucc'...
remote: Counting objects: 3043, done.
remote: Compressing objects: 100% (4/4), done.
remote: Total 3043 (delta 0), reused 12 (delta 0), pack-reused 3031
Receiving objects: 100% (3043/3043), 2.95 MiB | 1.13 MiB/s, done.
Resolving deltas: 100% (1542/1542), done.
$ cd new-bucc/
direnv: error .envrc is blocked. Run direnv allow to approve its content.
$ direnv allow
direnv: loading .envrc
$ bucc
-bash: bucc: command not found
bucc env will not set up the environment until there is a credential store. Maybe the PATH could be set up unconditionally, while the remaining variables are set up only once the credential store exists.
We could also improve the documentation by stating that you will need to type bin/bucc until you have run bucc up.
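A minimal sketch of that split inside bucc env (the credential-store location is an assumption):
repo_root="$(cd "$(dirname "$0")/.." && pwd)"
echo "export PATH=\"${repo_root}/bin:\$PATH\""      # always safe to emit
if [ -f "${repo_root}/state/creds.yml" ]; then      # credential store location assumed
  echo "export BOSH_ENVIRONMENT=192.168.50.6"
  echo "export BOSH_CA_CERT='${repo_root}/state/ca.pem'"
fi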
BUCC was installed under the user ubuntu.
Ubuntu creates unique groups for each user by default.
The user accounts norm and ofir were created so we can separate GitHub credentials and workspaces.
direnv was enabled for all accounts.
I added users norm and ofir to the ubuntu group to see if group access sharing would work,
then I changed directory to the bucc directory.
The following errors were produced:
norm@ip-10-1-0-244:~$ cd ~ubuntu/dev/bucc/
direnv: loading .envrc
./bin/bucc: line 353: /home/ubuntu/dev/bucc/state/ssh.key: Permission denied
chmod: changing permissions of '/home/ubuntu/dev/bucc/state/ssh.key': Operation not permitted
direnv: export +BOSH_CA_CERT +BOSH_CLIENT +BOSH_CLIENT_SECRET +BOSH_ENVIRONMENT +BOSH_GW_HOST +BOSH_GW_PRIVATE_KEY +BOSH_GW_USER ~PATH
Is there a better way than sharing one account for BUCC?
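One hedged option short of sharing an account: grant the group access to the state directory and make new files inherit the group (paths taken from the log above):
sudo chgrp -R ubuntu /home/ubuntu/dev/bucc/state
sudo chmod -R g+rwX /home/ubuntu/dev/bucc/state
sudo chmod g+s /home/ubuntu/dev/bucc/state   # new files inherit the ubuntu group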
$ bucc fly
fly version (2.7.0) is out of sync with the target (2.6.0). to sync up, run the following:
fly -t bucc sync
Expose login.oauth.providers (https://github.com/cloudfoundry/uaa-release/blob/develop/jobs/uaa/spec#L817) via bucc vars.yml.
Currently getting:
The request could not be completed because the credential does not exist or you do not have sufficient authorization.
Concourse relied on garden from bosh-lite.
Since lite is now optional, we should make sure to install garden-runc.
I downloaded + uploaded the warden stemcell but that failed:
$ bosh2 upload-stemcell ~/Downloads/bosh-stemcell-3363.12-warden-boshlite-ubuntu-trusty-go_agent.tgz
...
tar: ./dev/midi3: Cannot mknod: Operation not permitted
tar: ./dev/rmidi3: Cannot mknod: Operation not permitted
tar: ./dev/smpte3: Cannot mknod: Operation not permitted
tar: ./dev/agpgart: Cannot mknod: Operation not permitted
tar: Exiting with failure status due to previous errors
': exit status 2' in 'create_stemcell' CPI method
Add a bucc clean command so we can clean up our state files.
We also need to add a message that you can use the clean option after a bucc down.
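A minimal sketch of what bucc clean might do, matching the message reported earlier in this document ('state' dir cleaned, vars.yml moved aside); repo_root is assumed and the real behavior may preserve parts of state/:
clean() {
  [ -f "${repo_root}/vars.yml" ] && mv "${repo_root}/vars.yml" "${repo_root}/vars.yml.bck"
  rm -rf "${repo_root}/state"
  echo "'state' dir has been cleaned and 'vars.yml' has been moved to 'vars.yml.bck'"
}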
credhub_encryption_password will be generated by the bosh cli and ends up in state/creds.yml. We can use bbr to make backups of credhub, but we can't restore our secrets without credhub_encryption_password.
A potential solution could be to create a bosh release with a job which just puts some secrets in a file, which can then be backed up with bbr. This way we can ensure our secret stays in sync with our backups.
For this to work we need bucc to scp this creds file from the BUCC VM to state/creds.yml before doing a bosh create-env.
With the above in place a restore would do the following:
In order to upgrade to concourse 3.6.0 we have to upgrade postgres by switching to postgres-release.
After: 4b1f11a...1e4aeba
Deploying:
Building state for instance 'bosh/0':
Rendering job templates for instance 'bosh/0':
Rendering templates for job 'atc/9617fe9fc54a930f0e6af44fe639ab416b7f4204':
Rendering template src: token_signing_key.erb, dst: config/token_signing_key:
Rendering template src: /home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/bosh-release-job486234601/templates/token_signing_key.erb, dst: /home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/rendered-jobs215684276/config/token_signing_key:
Running ruby to render templates:
Running command: 'ruby /home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/erb-renderer851543046/erb-render.rb /home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/erb-renderer851543046/erb-context.json /home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/bosh-release-job486234601/templates/token_signing_key.erb /home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/rendered-jobs215684276/config/token_signing_key', stdout: '', stderr: '/home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/erb-renderer851543046/erb-render.rb:189:in `rescue in render': Error filling in template '/home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/bosh-release-job486234601/templates/token_signing_key.erb' for atc/0 (line 1: #<TemplateEvaluationContext::UnknownProperty: Can't find property 'token_signing_key.private_key'>) (RuntimeError)
from /home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/erb-renderer851543046/erb-render.rb:175:in `render'
from /home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/erb-renderer851543046/erb-render.rb:200:in `<main>'
':
exit status 1
Exit code 1
https://ci.starkandwayne.com/teams/main/pipelines/bucc/jobs/test/builds/238
Currently my bosh has the name Bosh Lite Director, which then turns up in credhub keys:
credhub find
Name Updated Date
/Bosh Lite Director/identity-idp/backend-email-sender 2017-05-05T00:58:14Z
/Bosh Lite Director/identity-idp/backend-email-port 2017-05-05T00:58:13Z
How do I change this?
Also, when running bucc up, can I pass more operator patch files?
I updated to latest and direnv allow / bucc env fails as:
$ bucc env
cat: /Users/drnic/Projects/bosh_deployments/bucc/state/lite: No such file or directory
/Users/drnic/Projects/bosh_deployments/bucc/bin/bucc: line 57: [: =: unary operator expected
cat: /Users/drnic/Projects/bosh_deployments/bucc/state/cpi: No such file or directory
find: /Users/drnic/Projects/bosh_deployments/bucc/ops/cpis//*.yml: No such file or directory
export BOSH_ENVIRONMENT=192.168.50.6
export BOSH_CA_CERT='/Users/drnic/Projects/bosh_deployments/bucc/state/ca.pem'
export BOSH_CLIENT=admin
cat: /Users/drnic/Projects/bosh_deployments/bucc/state/lite: No such file or directory
/Users/drnic/Projects/bosh_deployments/bucc/bin/bucc: line 57: [: =: unary operator expected
cat: /Users/drnic/Projects/bosh_deployments/bucc/state/cpi: No such file or directory
find: /Users/drnic/Projects/bosh_deployments/bucc/ops/cpis//*.yml: No such file or directory
export BOSH_CLIENT_SECRET=qvlnocqup9u94liagp51
Currently not working because of failing garden:
No supported storage backend found
Is this the best way to SSH into an instance using the BOSH as a gateway? Other ideas?
bosh ssh <job> --gw-user jumpbox --gw-private-key state/ssh.key --gw-host $(bosh2 int state/vars.yml --path /internal_ip)
NOTE: this assumes bucc ssh has already been run so as to create state/ssh.key
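Possibly simpler: bucc env already exports BOSH_GW_HOST, BOSH_GW_USER and BOSH_GW_PRIVATE_KEY (see the direnv output above), so once the environment is loaded a plain invocation may be enough; a hedged sketch:
eval "$(./bin/bucc env)"   # exports BOSH_GW_* alongside the BOSH_* client vars
bosh ssh <job>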
On my ubuntu jumpbox, bucc credhub is failing to download credhub 1.5.1 (and happily using the one already installed):
installing credhub cli into: /var/vcap/store/home/jumpbox/workspace/snw-deployments/aws-community-env/src/bucc/bin/
gzip: stdin: unexpected end of file
tar: Child returned status 1
tar: Error is not recoverable: exiting now
chmod: cannot access 'credhub': No such file or directory
mv: cannot stat 'credhub': No such file or directory
Setting the target url: https://10.10.1.4:8844
Login Successful
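The gzip: stdin: unexpected end of file suggests a truncated or non-gzip response; a hedged way to inspect what the download URL actually returns ($CREDHUB_CLI_URL is a hypothetical placeholder for whatever URL bin/bucc uses):
curl -sSL -o /tmp/credhub.tgz "$CREDHUB_CLI_URL"   # $CREDHUB_CLI_URL: hypothetical placeholder
file /tmp/credhub.tgz                              # should report gzip compressed data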
This will allow us to use those creds in Concourse once the Concourse Credhub integration is finished.