bucc's People

Contributors

bgandon, craigdbarber, damzog, daniellavoie, dashaun, drnic, fenech, genevieve, gsiener, jeffgbutler, jyriok, lnguyen, lucaspinto, martyca, matthewcosgrove, mdhender, mogul, nouseforaname, philippekhalife, ramonskie, rkoster, stevewallcgi, teancom, warroyo, xiujiao

bucc's Issues

update-cloud-config failed

Inside the bosh-deployment repo I tried to add a cloud config:

$ bosh update-cloud-config virtualbox/cloud-config.yml
Using environment '192.168.50.6' as user 'admin' (openid, bosh.admin)

Continue? [yN]: y

Updating cloud config:
  Director responded with non-successful status code '500' response ''
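
When the director returns a bare 500 with an empty response, the underlying error usually only shows up in the director's own logs. One way to look (a sketch; the path is the conventional BOSH job log location):

# inspect the director logs on the BUCC VM for the real error behind the 500
bucc ssh
# then, on the VM:
sudo tail -n 50 /var/vcap/sys/log/director/director.debug.log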

wrong flag values are getting remembered too

It seems that wrong flag values are getting remembered too.
For example:
bucc up --cpi=virtualbox
ends up in state/flags as:
cpi=virtualbox

Now running the correct command:
bucc up --cpi virtualbox
errors with:

unsupported flag: --cpi=virtualbox
flags for '--cpi virtualbox' are: --oauth-providers --proxy

When running bucc clean, the error is:

cp: cannot stat '/home/ramonskie/workspace/bucc/ops/cpis/virtualbox/flags/cpi=virtualbox.yml': No such file or directory
(the same cp error is repeated eight times)
'state' dir has been cleaned and 'vars.yml' has been moved to 'vars.yml.bck'
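
One possible fix is to normalize --flag=value into --flag value before parsing, so the bogus cpi=virtualbox flag name never gets persisted to state/flags. A minimal sketch, assuming the parser is a plain while/shift loop:

# expand "--flag=value" into "--flag value" before the parser runs
args=()
for arg in "$@"; do
  case "$arg" in
    --*=*) args+=("${arg%%=*}" "${arg#*=}") ;;
    *)     args+=("$arg") ;;
  esac
done
set -- "${args[@]}"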

Why install credhub if the local machine already has it installed?

In another terminal:

$ which credhub
/usr/local/bin/credhub

In bucc window:

$ bucc credhub
installing credhub cli into: /Users/drnic/Projects/bosh_deployments/bucc/bin/
Setting the target url: https://192.168.50.6:8844
Login Successful
$ which credhub
/Users/drnic/Projects/bosh_deployments/bucc/bin/credhub

Strangely, "source <(...)" in .envrc doesn't work with my Bash v3.2.57

When I go into the bucc project for the first time, running direnv allow doesn't modify the PATH environment variable:

$ git clone https://github.com/starkandwayne/bucc.git
...
$ cd bucc/
$ direnv allow
direnv: loading .envrc

Whereas if I just put eval "$(./bin/bucc env)" in .envrc then it works:

$ echo 'eval "$(./bin/bucc env)"' > .envrc
$ direnv allow
direnv: loading .envrc
direnv: export ~PATH

I didn't test this with the ${vars_store} file being present yet.

But anyway, maybe you should simplify your .envrc and just use eval "$(./bin/bucc env)" for both bash and zsh, don't you think?

Cheers,
/Benjamin

bucc backup/restore

Added bucc helper commands to backup and restore the state of the BUCC vm.
These helper commands should use bosh bbr.
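
For reference, a minimal sketch of what the backup helper could wrap, assuming bbr is on the PATH and reusing the jumpbox user and state/ssh.key that bucc ssh already sets up:

# back up the director with bbr, using bucc's existing state files
bbr director \
  --host "$(bosh int state/vars.yml --path /internal_ip)" \
  --username jumpbox \
  --private-key-path state/ssh.key \
  backup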

Stalled concourse workers after bucc update

After running bucc up to update an existing environment I'm getting:

vcap@lab:~/workspace/bucc$ fly -t bucc workers
name                                  containers  platform  tags  team  state    version
40730dcc-7171-40a6-442c-075795897c13  1           linux     none  none  running  1.1


the following workers have not checked in recently:

name                                  containers  platform  tags  team  state    version
0aa2a9ea-4bb9-4d9b-4ce2-fdacd90c8e2c  0           linux     none  none  stalled  1.1
76c3ee2d-de62-48ff-59ae-0f80bbf95abe  0           linux     none  none  stalled  1.1
a6f0a9b9-c472-4217-6368-2c65201f6ce5  0           linux     none  none  stalled  1.1
b3e15f0c-d4ab-4311-6a80-9228c064c628  1           linux     none  none  stalled  1.1

these stalled workers can be cleaned up by running:

    fly -t bucc prune-worker -w (name)
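
Until bucc does this automatically, all of the stalled workers can be pruned in one go (a sketch; assumes the fly target bucc is already logged in):

# prune every worker that fly reports as stalled
fly -t bucc workers | grep stalled | awk '{print $1}' \
  | xargs -n1 fly -t bucc prune-worker -w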

bucc uaac failed

$ bucc uaac
script: illegal option -- -
usage: script [-adkpqr] [-t time] [file [command ...]]

$ uname -a
Darwin starkair.local 15.6.0 Darwin Kernel Version 15.6.0: Mon Jan  9 23:07:29 PST 2017; root:xnu-3248.60.11.2.1~1/RELEASE_X86_64 x86_64

Same for bucc credhub:

$ bucc credhub
script: illegal option -- -
usage: script [-adkpqr] [-t time] [file [command ...]]

I've never seen script before; I'm not sure what role it's performing and can't guess what the macOS flags might be.
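
The error suggests bin/bucc shells out to script with GNU-style flags: GNU script (util-linux) takes -c "command", while the BSD script shipped with macOS expects the command after the typescript file. A portable wrapper could look like this sketch (run_with_tty is a hypothetical helper name):

# run a command under script(1) on both GNU and BSD userlands
run_with_tty() {
  if [ "$(uname)" = "Darwin" ]; then
    # BSD/macOS: script [options] [file [command ...]]
    script -q /dev/null "$@"
  else
    # GNU: script [options] -c "command" [file]
    script -qc "$*" /dev/null
  fi
}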

Improving readme

I ran ./bin/bucc at the root of this repo since I did not add bucc to the PATH.
It would be nice to add more info about how to run the bucc command: either run it as ./bin/bucc, or add it to the PATH so you can just run bucc from the repo root.
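
For example, the README could show both options (sketch):

# from the repo root, either call the script directly:
./bin/bucc up
# or put it on the PATH once and use the bare command:
export PATH="$PWD/bin:$PATH"
bucc up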

`bucc up` fails on new repo

git clone https://github.com/starkandwayne/bucc.git
cd bucc
bucc up

Results are:

cp: /Users/drnic/Projects/bosh_deployments/bucc/ops/0-jumpbox-user.yml: No such file or directory
cp: /Users/drnic/Projects/bosh_deployments/bucc/ops/1-garden-runc.yml: No such file or directory
cp: /Users/drnic/Projects/bosh_deployments/bucc/ops/2-uaa.yml: No such file or directory
cp: /Users/drnic/Projects/bosh_deployments/bucc/ops/3-credhub.yml: No such file or directory
cp: /Users/drnic/Projects/bosh_deployments/bucc/ops/cpis/virtualbox/0-cpi.yml: No such file or directory
cp: /Users/drnic/Projects/bosh_deployments/bucc/ops/cpis/virtualbox/0-outbound-network.yml: No such file or directory
Opening file /Users/drnic/Projects/bosh_deployments/bucc/src/bosh-deployment/bosh.yml:
  open /Users/drnic/Projects/bosh_deployments/bucc/src/bosh-deployment/bosh.yml: no such file or directory

Exit code 1

Suggestion: automatically run git submodule update --init if files are missing
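
A sketch of that guard, assuming bosh-deployment is vendored as a git submodule under src/:

# fetch submodules on first run if the vendored manifest is missing
if [ ! -f "src/bosh-deployment/bosh.yml" ]; then
  git submodule update --init --recursive
fi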

What about SoftLayer CPI support?

I think we would be interested in doing a PR for SoftLayer support. What are the requirements for adding a new IaaS to the tool?

What do you need from us to support this?

Bucc resume (for virtualbox)

It would be nice to be able to resume a bucc-deployed VirtualBox VM.
Currently the bosh jobs won't be able to start because they are missing their persistent disk: cloudfoundry/bosh-virtualbox-cpi-release#7

Below are some things I have tried so far to work around the above issue, with mixed results:

df /var/vcap/data | grep /var/vcap/data # wait for the agent to mount /var/vcap/data
# the /dev/sd* partition that fdisk knows about but mtab doesn't is the persistent disk
disk=$(sort <(cat /etc/mtab | grep /dev/sd | cut -d ' ' -f1) <(fdisk -l | grep /dev/sd | grep -v : | grep -v swap | cut -d ' ' -f1) | uniq -u)
mount ${disk} /var/vcap/store # re-attach the persistent disk by hand
monit # then let monit bring the jobs back up



# boot the VM recorded in the create-env state, then fix the disk symlink inside it
vboxmanage startvm $(bosh int state/state.json --path /current_vm_cid) --type headless
bucc ssh
cd /dev/disk/by-id/ && ln -s ../../sdc 1ATA # recreate the by-id link the agent looks for

Expecting Private Key while submitting bucc up command

git-repos/bucc$ ./bin/bucc up --cpi aws --lite
Deployment manifest: '/mnt/c/Users/Aashni/git-repos/bucc/src/bosh-deployment/bosh.yml'
Deployment state: '/mnt/c/Users/Aashni/git-repos/bucc/state/state.json'
Started validating
Failed validating (00:00:00)
Parsing release set manifest '/mnt/c/Users/Aashni/git-repos/bucc/src/bosh-deployment/bosh.yml':
Evaluating manifest:
- Expected to find variables:
- private_key
Exit code 1

It is expecting the private_key, but I don't see the input in the vars.yml file. Please advise.
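
If it helps while this is triaged: the AWS CPI manifest evidently expects a private_key variable, so one workaround sketch is to supply it in vars.yml yourself before re-running bucc up (key material elided):

# append the missing variable to vars.yml (paste your actual key)
cat >> vars.yml <<'EOF'
private_key: |
  -----BEGIN RSA PRIVATE KEY-----
  ...
  -----END RSA PRIVATE KEY-----
EOF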

`bosh2 alias-env bucc` doesn't work outside of `bucc` dir

Inside the bucc dir, following the README:

bosh2 alias-env bucc

Then I go to one of my release repos:

$ bosh2 -e bucc stemcells
Using environment '192.168.50.6' as anonymous user

Finding stemcells:
  Director responded with non-successful status code '401' response 'Not authorized: '/stemcells'
'

Exit code 1
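
The alias only stores the director URL and CA cert; the 401 means no credentials were sent. One workaround is to export what bucc env prints before using the alias from another directory (sketch; adjust the path to your checkout):

# from any directory, load the bucc credentials into the shell
eval "$(/path/to/bucc/bin/bucc env)"
bosh2 -e bucc stemcells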

postgres pre-start failure

https://ci.starkandwayne.com/teams/main/pipelines/bucc/jobs/upgrade-test/builds/29

less /var/vcap/sys/log/postgres/pre-start.stderr.log
+++ [[ -n /var/vcap/packages/postgres-9.6.4/lib ]]
+++ LD_LIBRARY_PATH=/var/vcap/packages/postgres-9.6.4/lib:/var/vcap/packages/postgres-9.6.4/lib
+ '[' -d /var/vcap/store/postgres/postgres-9.6.4 -a -f /var/vcap/store/postgres/POSTGRES_UPGRADE_LOCK ']'
+ '[' -d /var/vcap/store/postgres/postgres-unknown -a -f /var/vcap/store/postgres/postgres-unknown/postgresql.conf ']'
+ init_data_dir
+ '[' '!' -f /var/vcap/store/postgres/postgres-9.6.4/postgresql.conf ']'
+ su - vcap -c '/var/vcap/packages/postgres-9.6.4/bin/initdb -E utf8 --locale en_US.UTF-8 -D /var/vcap/store/postgres/postgres-9.6.4'
FATAL:  could not open file "base/1/3597": No such file or directory
STATEMENT:  DROP TABLE tmp_pg_description;

FATAL:  could not open file "base/1/3597": No such file or directory
PANIC:  could not open control file "global/pg_control": No such file or directory
Aborted
child process exited with exit code 134
initdb: removing data directory "/var/vcap/store/postgres/postgres-9.6.4"
could not open directory "/var/vcap/store/postgres/postgres-9.6.4": No such file or directory
initdb: failed to remove data directory

bosh2: command not found

It works, but it would be nice to check if bosh2 exists:

$ bucc credhub
/home/vcap/workspace/bucc-lite/bin/bucc: line 8: bosh2: command not found
Warning: The targeted TLS certificate has not been verified for this connection.
Setting the target url: https://192.168.50.6:8844
Login Successful
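
A guard near the top of bin/bucc could fail fast with a clear message (sketch):

# fail early if the bosh CLI v2 binary is missing
command -v bosh2 >/dev/null 2>&1 || {
  echo "bosh2 not found on PATH; please install the bosh CLI v2" >&2
  exit 1
}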

bucc up fails

$ bucc up
rm: /Users/drnic/Projects/bosh_deployments/bucc/state/manifests/*: No such file or directory
cp: directory /Users/drnic/Projects/bosh_deployments/bucc/state/manifests does not exist
find: /Users/drnic/Projects/bosh_deployments/bucc/state/manifests/*.yml: No such file or directory
usage: cp [-R [-H | -L | -P]] [-fi | -n] [-apvX] source_file target_file
       cp [-R [-H | -L | -P]] [-fi | -n] [-apvX] source_file ... target_directory
find: /Users/drnic/Projects/bosh_deployments/bucc/state/manifests/*.yml: No such file or directory
usage: cp [-R [-H | -L | -P]] [-fi | -n] [-apvX] source_file target_file
       cp [-R [-H | -L | -P]] [-fi | -n] [-apvX] source_file ... target_directory
find: /Users/drnic/Projects/bosh_deployments/bucc/state/manifests/*.yml: No such file or directory
Deployment manifest: '/Users/drnic/Projects/bosh_deployments/bucc/src/bosh-deployment/bosh.yml'
Deployment state: '/Users/drnic/Projects/bosh_deployments/bucc/state/state.json'

Started validating
Failed validating (00:00:00)

Parsing installation manifest '/Users/drnic/Projects/bosh_deployments/bucc/src/bosh-deployment/bosh.yml':
  Validating installation manifest:
    - cloud_provider.template.name must be provided
    - cloud_provider.template.release must be provided
    - cloud_provider.template.release '' must refer to a release in releases

realpath does not exist on my mac

$ /Users/Norman/Projects/bosh-env-bucc-vbox/bucc/bin/bucc: line 4: realpath: command not found
-bash: /Users/Norman/Projects/bosh-env-bucc-vbox/bucc/bin/bucc:: No such file or directory
$ usage: dirname path
-bash: usage:: command not found

I attempted to use the readlink and stat commands, but it still failed.

I ended up just using dirname $0.
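
For the record, a portable alternative that works on stock macOS without coreutils (sketch):

# resolve the script's directory without realpath
repo_root="$(cd "$(dirname "$0")" && pwd)"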

Openstack keystone v2 support

How can we add OpenStack Keystone v2 API support?
An ops file keystone-v2.yml already exists in bosh-deployment, but how do we enable it with bucc? Do we need to create a new CPI with this ops file included?
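
One hypothetical approach, assuming the ops file lives under openstack/ in the vendored submodule and that the numbered-file layout under ops/cpis/ (seen in other issues here) is how bucc picks up CPI ops files:

# sketch: derive a new CPI dir that also includes the keystone-v2 ops file
cp -r ops/cpis/openstack ops/cpis/openstack-keystone-v2
cp src/bosh-deployment/openstack/keystone-v2.yml \
  ops/cpis/openstack-keystone-v2/1-keystone-v2.yml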

Refactor vars.yml

Since the '-lite' aspect is now optional, vars should not be hardcoded.

Fix compile releases ci job

https://ci.starkandwayne.com/teams/main/pipelines/bucc/jobs/compile-releases/builds/202

Try deploying with:

azs:
- name: z1
- name: z2
- name: z3

vm_types:
- name: default

vm_extensions:
- cloud_properties:
    ports:
    - 22/tcp
  name: all_ports

compilation:
  az: z1
  network: default
  reuse_compilation_vms: true
  vm_type: default
  workers: 5

networks:
- name: default
  subnets:
  - azs:
    - z1
    - z2
    - z3
    cloud_properties:
      name: director_network
    dns:
    - 8.8.8.8
    gateway: 10.245.0.1
    range: 10.245.0.0/16
    static:
    - 10.245.0.34
  type: manual

disk_types:
- disk_size: 1024
  name: default

stemcells:
- alias: large
  os: ubuntu-trusty
  version: '3468.11'

releases:
- name: os-conf
  sha1: 78d79f08ff5001cc2a24f572837c7a9c59a0e796
  url: https://bosh.io/d/github.com/cloudfoundry/os-conf-release?v=18
  version: '18'
- name: concourse
  sha1: 20e17e3ac079f1b1a5095329a4fb14de40317bb9
  url: https://bosh.io/d/github.com/concourse/concourse?v=3.7.0
  version: 3.7.0
- name: credhub-importer
  sha1: a945710d1d7d3dfb8f8e117aa6f81e2fb5c70709
  url: https://github.com/cloudfoundry-community/credhub-importer-boshrelease/releases/download/1/credhub-importer-1.tgz
  version: '1'

update:
  canaries: 1
  canary_watch_time: 1000 - 90000
  max_in_flight: 1
  update_watch_time: 1000 - 90000

instance_groups: []

name: bucc-compiled-releases

then run bosh export-release -d bucc-compiled-releases os-conf/18 ubuntu-trusty/3468.11

bosh2 error

I only have bosh 2.0.1, since I got rid of bosh CLI v1 entirely, and I use bosh instead of bosh2 as my command. When I run source <(bucc env), it does not successfully export all the environment variables it needs, since bucc env errors out with ./bin/bucc: line 8: bosh2: command not found before reaching those export clauses.
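
A fallback sketch for bin/bucc that prefers bosh2 but accepts a CLI v2 installed as bosh:

# pick whichever v2 CLI binary is available
if command -v bosh2 >/dev/null 2>&1; then
  bosh_bin=bosh2
else
  bosh_bin=bosh
fi
"$bosh_bin" --version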

BUCC Deployment with AWS Cred Service SSL support

Greetings,

Background

I've been using and liking BUCC for a while, but now have a situation where I want to deploy a new instance and set it up so that I can test SAML integration between UAA (acting as the Service Provider (SP)) and several Identity Providers (IdPs).

Rationale

I think that in order to do that I need a UAA with a real SSL certificate, and therefore I'd like to provision my UAA so that it's front-ended by an ELB/ALB using certs issued by AWS Certificate Manager. Rationale: I'm testing, I don't want to incur the cost of certs, and AWS Certificate Manager will issue them for free.

Request

I've been bashing my brains out trying to RTFM, and not getting very far. So - a couple of questions:

  • How would I go about modifying the BUCC configuration so that it works with the load balancer? All of the config files and documentation that I've found assume self-signed certs that are embedded in the YML files. With AWS Certificate Manager, I'm pretty sure I can't get a copy of that information; it's buried inside the service and then provided behind the scenes to "Selected AWS Services" (like the load balancer).
  • Does this make sense: setting up UAA with a load balancer that's using an AWS-issued cert to handle the SSL connection?
  • Assuming that my UAA server is at https://mycompany.net, what would the URL look like to download the SAML metadata that I have to provide to my IdPs? I'm thinking: https://mycompany.net:8443/metadata/saml. Does that sound correct?

Sorry for the length of this description. Post back if you need add'l detail. Any advice greatly appreciated!!!

Compile-releases task should use s3 bucket as source of truth

Currently compile-releases does some manifest magic to figure out what needs to be compiled.

This leads to duplicate compiles, since there can be commits between the start of the compile and the update-ops-file step.

We could solve this by doing something like:

curl "https://s3-eu-west-1.amazonaws.com/bucc-compiled-releases/?prefix=concourse" | grep "3.2.1-ubuntu-trusty-3421.9"

Using BUCC on Docker

Hi, we're looking at BUCC as an option for a simple bootstrap of our platform, and we want to run it inside a single server using the Docker CPI. We can't use the Virtualbox-based BOSH Lite because we're running on AWS EC2 VMs, and running BUCC directly on our AWS environment would take more setup than makes sense for a bootstrap.

I had a go using BUCC with Docker, but hit this error:
cloudfoundry/bosh-deployment#93

It's hard to debug this because there's zero documentation on how the Docker CPI is even intended to be used, so I don't even know if I'm just "holding it wrong". Do you have a working example of running BUCC using the Docker CPI?

bucc up hangs indefinitely on downloading concourse release

When running the most current version of bucc on macOS, it hangs indefinitely while trying to download the concourse release.

→ ./bin/bucc up --lite
Using flags: --cpi virtualbox --lite
Flag files: '/Users/loaner/workspace/bucc/state/{cpi,lite,flags}'

Deployment manifest: '/Users/loaner/workspace/bucc/src/bosh-deployment/bosh.yml'
Deployment state: '/Users/loaner/workspace/bucc/state/state.json'

Started validating
  Downloading release 'bosh'... Skipped [Found in local cache] (00:00:00)
  Validating release 'bosh'... Finished (00:00:00)
  Downloading release 'bosh-virtualbox-cpi'... Skipped [Found in local cache] (00:00:00)
  Validating release 'bosh-virtualbox-cpi'... Finished (00:00:00)
  Downloading release 'os-conf'... Skipped [Found in local cache] (00:00:00)
  Validating release 'os-conf'... Finished (00:00:00)
  Downloading release 'bosh-warden-cpi'... Skipped [Found in local cache] (00:00:00)
  Validating release 'bosh-warden-cpi'... Finished (00:00:00)
  Downloading release 'garden-runc'... Skipped [Found in local cache] (00:00:00)
  Validating release 'garden-runc'... Finished (00:00:00)
  Downloading release 'uaa'... Skipped [Found in local cache] (00:00:00)
  Validating release 'uaa'... Finished (00:00:00)
  Downloading release 'concourse'... <------------------------------ Stays here forever

Any ideas how I can debug this?
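
One way to check whether the download itself is the problem is to fetch the release by hand, using the bosh.io URL pattern that appears in the compiled-releases manifest in another issue here (adjust the version; sketch):

# verbose manual download to see where it stalls
curl -vL -o /tmp/concourse.tgz \
  "https://bosh.io/d/github.com/concourse/concourse?v=3.7.0"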

direnv does not set PATH after initial clone of the bucc directory

Assumption: direnv installed

$ git clone [email protected]:starkandwayne/bucc.git new-bucc
Cloning into 'new-bucc'...
remote: Counting objects: 3043, done.
remote: Compressing objects: 100% (4/4), done.
remote: Total 3043 (delta 0), reused 12 (delta 0), pack-reused 3031
Receiving objects: 100% (3043/3043), 2.95 MiB | 1.13 MiB/s, done.
Resolving deltas: 100% (1542/1542), done.
$ cd new-bucc/
direnv: error .envrc is blocked. Run direnv allow to approve its content.
$ direnv allow
direnv: loading .envrc
$ bucc
-bash: bucc: command not found

bucc env will not set up the environment until there is a credential store. Maybe the PATH could be set up unconditionally, while the remaining variables are only set once the credential store exists.

We could also address this in the documentation by stating that you will need to type bin/bucc until you have run bucc up.
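
A sketch of such an .envrc; PATH_add is direnv's stdlib helper, and the state/creds.yml check is an assumption about when the credential store exists:

# always expose bin/, but only eval the full environment once creds exist
PATH_add bin
if [ -f state/creds.yml ]; then
  eval "$(./bin/bucc env)"
fi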

What is the appropriate way to access bucc across user accounts?

BUCC was installed under the user ubuntu.

Ubuntu creates unique groups for each user by default.

The user accounts norm and ofir were created so we can separate GitHub credentials and workspaces.

direnv was enabled for all accounts.

I added users norm and ofir to the ubuntu group to see if group access sharing would work.

Then I changed directory to the bucc directory.

The following errors were produced:

norm@ip-10-1-0-244:~$ cd ~ubuntu/dev/bucc/
direnv: loading .envrc
./bin/bucc: line 353: /home/ubuntu/dev/bucc/state/ssh.key: Permission denied
chmod: changing permissions of '/home/ubuntu/dev/bucc/state/ssh.key': Operation not permitted
direnv: export +BOSH_CA_CERT +BOSH_CLIENT +BOSH_CLIENT_SECRET +BOSH_ENVIRONMENT +BOSH_GW_HOST +BOSH_GW_PRIVATE_KEY +BOSH_GW_USER ~PATH

Is there a better way than sharing one account for BUCC?
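
One possible workaround until there is an official answer: give the shared group access to the state dir (a sketch, run as the owning user, using this report's paths and the ubuntu group):

# let members of the ubuntu group read and write the state files
chgrp -R ubuntu /home/ubuntu/dev/bucc/state
chmod -R g+rwX /home/ubuntu/dev/bucc/state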

Upgrade to concourse 2.7.0

$ bucc fly
fly version (2.7.0) is out of sync with the target (2.6.0). to sync up, run the following:

    fly -t bucc sync

Add garden on no-lite branch

Concourse relied on garden from bosh-lite.
Since lite is now optional, we should make sure to install garden/runc.

Which stemcell?

I downloaded and uploaded the warden stemcell, but that failed:

$ bosh2 upload-stemcell ~/Downloads/bosh-stemcell-3363.12-warden-boshlite-ubuntu-trusty-go_agent.tgz
...
tar: ./dev/midi3: Cannot mknod: Operation not permitted
tar: ./dev/rmidi3: Cannot mknod: Operation not permitted
tar: ./dev/smpte3: Cannot mknod: Operation not permitted
tar: ./dev/agpgart: Cannot mknod: Operation not permitted
tar: Exiting with failure status due to previous errors
': exit status 2' in 'create_stemcell' CPI method

add a clean command

Add a bucc clean command so we can clean up our state files.

We also need to add a message saying that you can use the clean option after a bucc down.
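
A minimal sketch of what bucc clean could do; the vars.yml.bck behaviour mirrors the output quoted in the flags issue above:

# wipe local state, keeping a backup of vars.yml
clean() {
  rm -rf state/
  [ -f vars.yml ] && mv vars.yml vars.yml.bck
  echo "'state' dir has been cleaned and 'vars.yml' has been moved to 'vars.yml.bck'"
}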

Find a way to backup the credhub_encryption_password

credhub_encryption_password will be generated by the bosh cli and ends up in state/creds.yml. We can use bbr to make backups of credhub, but we can't restore our secrets without credhub_encryption_password.

A potential solution could be to create a bosh release with a job which just puts some secrets in a file, which can then be backed up with bbr. This way we can ensure our secret stays in sync with our backups.

For this to work we need bucc to scp this creds.yml file from the BUCC VM to state/creds.yml before doing a bosh create-env.

With the above in place a restore would do the following:

  • Restore credhub db
  • Restore bucc creds.yml file inside /var/vcap/jobs/store/creds.yml
  • scp /var/vcap/jobs/store/creds.yml to state/creds.yml
  • bucc up (to apply the restored creds.yml)
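
A sketch of the scp step, reusing the jumpbox user and state/ssh.key from bucc ssh, with the paths from the list above:

# pull the restored creds file back into local state, then re-apply it
scp -i state/ssh.key \
  "jumpbox@$(bosh int state/vars.yml --path /internal_ip):/var/vcap/jobs/store/creds.yml" \
  state/creds.yml
bucc up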

CI broken

After: 4b1f11a...1e4aeba

Deploying:
  Building state for instance 'bosh/0':
    Rendering job templates for instance 'bosh/0':
      Rendering templates for job 'atc/9617fe9fc54a930f0e6af44fe639ab416b7f4204':
        Rendering template src: token_signing_key.erb, dst: config/token_signing_key:
          Rendering template src: /home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/bosh-release-job486234601/templates/token_signing_key.erb, dst: /home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/rendered-jobs215684276/config/token_signing_key:
            Running ruby to render templates:
              Running command: 'ruby /home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/erb-renderer851543046/erb-render.rb /home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/erb-renderer851543046/erb-context.json /home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/bosh-release-job486234601/templates/token_signing_key.erb /home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/rendered-jobs215684276/config/token_signing_key', stdout: '', stderr: '/home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/erb-renderer851543046/erb-render.rb:189:in `rescue in render': Error filling in template '/home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/bosh-release-job486234601/templates/token_signing_key.erb' for atc/0 (line 1: #<TemplateEvaluationContext::UnknownProperty: Can't find property 'token_signing_key.private_key'>) (RuntimeError)
        from /home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/erb-renderer851543046/erb-render.rb:175:in `render'
        from /home/vcap/.bosh/installations/3debc025-5fbc-4f43-5cd7-0205cd465ba7/tmp/erb-renderer851543046/erb-render.rb:200:in `<main>'
':
                exit status 1

Exit code 1

https://ci.starkandwayne.com/teams/main/pipelines/bucc/jobs/test/builds/238

How do I add more operator files and/or rename bosh from "Bosh Lite Director"

Currently my bosh has the name Bosh Lite Director which then turns up in credhub keys:

credhub find
Name                                                            Updated Date
/Bosh Lite Director/identity-idp/backend-email-sender           2017-05-05T00:58:14Z
/Bosh Lite Director/identity-idp/backend-email-port             2017-05-05T00:58:13Z

How do I change this?

Also, when running bucc up can I pass more operator patch files?

bucc env - several "No such file or directory"

I updated to the latest version, and direnv allow / bucc env fails with:

$ bucc env
cat: /Users/drnic/Projects/bosh_deployments/bucc/state/lite: No such file or directory
/Users/drnic/Projects/bosh_deployments/bucc/bin/bucc: line 57: [: =: unary operator expected
cat: /Users/drnic/Projects/bosh_deployments/bucc/state/cpi: No such file or directory
find: /Users/drnic/Projects/bosh_deployments/bucc/ops/cpis//*.yml: No such file or directory
export BOSH_ENVIRONMENT=192.168.50.6
export BOSH_CA_CERT='/Users/drnic/Projects/bosh_deployments/bucc/state/ca.pem'
export BOSH_CLIENT=admin
cat: /Users/drnic/Projects/bosh_deployments/bucc/state/lite: No such file or directory
/Users/drnic/Projects/bosh_deployments/bucc/bin/bucc: line 57: [: =: unary operator expected
cat: /Users/drnic/Projects/bosh_deployments/bucc/state/cpi: No such file or directory
find: /Users/drnic/Projects/bosh_deployments/bucc/ops/cpis//*.yml: No such file or directory
export BOSH_CLIENT_SECRET=qvlnocqup9u94liagp51
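
The noise comes from unguarded reads of state files that do not exist yet. A defensive sketch (the virtualbox default mirrors the "Using flags" output seen in another issue here):

# default the state lookups so a fresh checkout doesn't spew errors
lite="$(cat state/lite 2>/dev/null || echo false)"
cpi="$(cat state/cpi 2>/dev/null || echo virtualbox)"
[ "${lite}" = "true" ] && echo "lite mode enabled"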

Better way to ssh into deployment instance?

Is this the best way to SSH into an instance using the BOSH as a gateway? Other ideas?

bosh ssh <job> --gw-user jumpbox --gw-private-key state/ssh.key --gw-host $(bosh2 int state/vars.yml --path /internal_ip)

NOTE: this assumes bucc ssh has already been run so as to create state/ssh.key
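
Alternatively, the gateway settings can live in environment variables, which are exactly what direnv exports according to another report here, so the ssh command stays short:

# export the gateway settings once; bosh ssh then needs no extra flags
export BOSH_GW_USER=jumpbox
export BOSH_GW_PRIVATE_KEY=state/ssh.key
export BOSH_GW_HOST="$(bosh2 int state/vars.yml --path /internal_ip)"
bosh ssh <job>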

credhub download fails

On my ubuntu jumpbox, bucc credhub is failing to download credhub 1.5.1 (and is happily using the one already installed):

installing credhub cli into: /var/vcap/store/home/jumpbox/workspace/snw-deployments/aws-community-env/src/bucc/bin/

gzip: stdin: unexpected end of file
tar: Child returned status 1
tar: Error is not recoverable: exiting now
chmod: cannot access 'credhub': No such file or directory
mv: cannot stat 'credhub': No such file or directory
Setting the target url: https://10.10.1.4:8844
Login Successful
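
The gzip "unexpected end of file" means tar was fed a truncated or non-archive response. A way to check by hand (sketch; substitute the actual credhub release URL that bin/bucc uses):

# -f makes curl fail on HTTP errors instead of piping an error page into tar
curl -fL -o /tmp/credhub.tgz "<credhub release URL>"
tar -tzf /tmp/credhub.tgz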
