
jenkins-client-plugin's Issues

'incorrect namespace' error from create() with objects for other namespaces

This issue was recently raised on the OpenShift RedHat Email List and recreated here for comment/consideration.

I have a template that can be successfully processed and the objects created using oc from the command line. The template is supposed to run in one namespace (let's call it Y) but it creates secrets that are placed in another namespace/project (let's call that X). The namespaces are managed by the same user. Both namespaces exist, and the following command, when run on the command line, is valid and produces the expected results:

oc process -f <template-file> | oc create -f -

But you cannot do this in a Jenkins pipeline without getting an error. Here's my pipeline code...

openshift.withCluster("${CLUSTER}") {
    openshift.withProject("${Y}") {
        def objs = openshift.process('--filename=<template-file>')
        openshift.create(objs)
    }
}

When I do this I get the following error reported in the Jenkins Job output:

err=error: the namespace from the provided object "X" does not match the namespace "Y". You must pass '--namespace=X' to perform this operation., verb=create

How do you replicate the actions that appear to be legitimate from the command-line by using the pipeline plugin? The plugin appears to assume that the objects passed to create() must reside in the project namespace, but they do not have to.

Incidentally, I can work around the problem by iterating through the list of objects created by the call to process(), i.e. by doing this...

def objs = openshift.process('--filename=<template-file>')
for (obj in objs) {
    if (obj.metadata.namespace == "X") {
        openshift.create(obj, "--namespace=X")
    } else {
        openshift.create(obj)
    }
}

Shouldn't create(), like its command-line-counterpart, honour the object namespace?

Enable CA Verify with insecure-skip-tls-verify: true is broken

When this is configured in ~jenkins/.kube/config at the OS level:

- cluster:
    insecure-skip-tls-verify: true

I cannot enable SSL verification at the cluster level in Jenkins. I always get:

  action failed: {reference={}, err=error: specifying a root certificates file with the insecure flag is not allowed, verb=get

This should be possible. The plugin should not depend on system configuration. It could instead:

  • set KUBECONFIG to something sane, or
  • always override insecure-skip-tls-verify (even if false), or (best)
  • allow setting environment variables at the global cluster level.

Workaround: Wrap in withEnv(["KUBECONFIG=${workspace}"]) - with deleteDir() in a finally block
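
A sketch of that workaround (the cluster name and kubeconfig path are placeholders; the key point is giving oc a private KUBECONFIG and cleaning it up afterwards):

node {
    // point oc at a scratch kubeconfig inside the workspace instead of ~jenkins/.kube/config
    def scratchKubeconfig = "${pwd()}/kubeconfig.scratch"
    try {
        withEnv(["KUBECONFIG=${scratchKubeconfig}"]) {
            openshift.withCluster('mycluster') {
                echo openshift.raw('version').out
            }
        }
    } finally {
        deleteDir()   // remove the scratch kubeconfig along with the workspace contents
    }
}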

Security: deny token retrieval

Hello,
right now you can use the following to retrieve the token that Jenkins stores encrypted:

openshift.raw("whoami","-t")

This should be blocked by the plugin to prevent easy leakage of token information.

NPE when missing object is not helpful to users

Whenever I forget to wrap a call in withCluster() or try to .object() on something that doesn't exist, I get:

java.lang.NullPointerException: Cannot invoke method isSkipTlsVerify() on null object
	at org.codehaus.groovy.runtime.NullObject.invokeMethod(NullObject.java:91)
	at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:48)
	at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
	at org.codehaus.groovy.runtime.callsite.NullCallSite.call(NullCallSite.java:35)
	at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
	at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
	at com.cloudbees.groovy.cps.sandbox.DefaultInvoker.methodCall(DefaultInvoker.java:19)
	at com.openshift.jenkins.plugins.OpenShiftDSL.buildCommonArgs(jar:file:/var/lib/jenkins/plugins/openshift-client/WEB-INF/lib/openshift-client.jar!/com/openshift/jenkins/plugins/OpenShiftDSL.groovy:368)
	at com.openshift.jenkins.plugins.OpenShiftDSL.buildCommonArgs(jar:file:/var/lib/jenkins/plugins/openshift-client/WEB-INF/lib/openshift-client.jar!/com/openshift/jenkins/plugins/OpenShiftDSL.groovy:351)
	at com.openshift.jenkins.plugins.OpenShiftDSL$OpenShiftResourceSelector._asSingleMap(jar:file:/var/lib/jenkins/plugins/openshift-client/WEB-INF/lib/openshift-client.jar!/com/openshift/jenkins/plugins/OpenShiftDSL.groovy:1069)
	at com.openshift.jenkins.plugins.OpenShiftDSL$OpenShiftResourceSelector.object(jar:file:/var/lib/jenkins/plugins/openshift-client/WEB-INF/lib/openshift-client.jar!/com/openshift/jenkins/plugins/OpenShiftDSL.groovy:1085)

This is not a very helpful stack trace. If we could wrap it in some context that reminds the user to use withCluster(), or that suggests no object was selected, it would be much more helpful.
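
A defensive sketch of the two failure modes (resource names are placeholders; count() is assumed to be available on selectors):

openshift.withCluster() {                       // without this wrapper the DSL has no cluster context at all
    openshift.withProject('myproject') {
        def sel = openshift.selector('dc', 'myapp')
        if (sel.count() > 0) {                  // guard against selecting nothing before calling object()
            echo "Found ${sel.object().metadata.name}"
        } else {
            error 'No deploymentconfig named myapp was selected'
        }
    }
}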

/cc @gabemontero @bparees @jupierce

expanded environment variables in raw oc invocation

When using the build-step "generic oc invocation", environment variables are not expanded.

Build log output:

Executing: oc start-build testapp --from-dir=${OSE_BINARYBUILD_DIR} --server=https://xxxx --namespace=testapp-demo --token=XXXXX 
error: stat /var/jenkins_home/${OSE_BINARYBUILD_DIR}: no such file or directory
Client tool terminated with status: 1
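
For comparison, when driving oc from the pipeline DSL instead of the freestyle build step, the expansion can be done in Groovy before the command line is built (a sketch, assuming OSE_BINARYBUILD_DIR is exposed as an environment variable):

openshift.withCluster() {
    openshift.withProject('testapp-demo') {
        // Groovy interpolation resolves the variable before oc is invoked
        openshift.raw('start-build', 'testapp', "--from-dir=${env.OSE_BINARYBUILD_DIR}")
    }
}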

Issue switching to another cluster

Hi, I changed the hostname in my cluster configuration in Jenkins and now when I run my pipeline I get this error:

ERROR: new-project returned an error;
{reference={}, err=error: unable to read certificate-authority /var/run/secrets/kubernetes.io/serviceaccount/ca.crt for oldhost.com:8443 due to open /var/run/secrets/kubernetes.io/serviceaccount/ca.crt: no such file or directory, verb=new-project, cmd=oc new-project petclinic-build --skip-config-write --server=https://newhost.com --token=XXXXX , out=, status=1}

I am using an external Jenkins instance outside OpenShift.

Error running commands outside withProject block

Hi, I am using plugin version 0.9.5 with oc client 1.5.1 on a Jenkins server outside OpenShift. When I try to run some commands, such as openshift.newProject or openshift.process, outside a withProject block, I get this error:

java.io.FileNotFoundException: /var/run/secrets/kubernetes.io/serviceaccount/namespace (No such file or directory)

If I put them inside a withProject block then they work.
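
For reference, the form that works looks like this (placeholder names):

openshift.withCluster('mycluster') {
    openshift.withProject('some-existing-project') {
        // both of these fail with the FileNotFoundException above when run outside a withProject block
        openshift.newProject('my-new-project')
        def objs = openshift.process('--filename=template.yaml')
    }
}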

facilitate CD team annotate scenario

@coreydaley - I've forwarded an email thread from aos-cicd-devops to you.

There were exchanges between @stevekuznetsov , @csrwng , @jupierce , and myself around a create imagestream flow, followed by an annotation of that flow, along with some permutations of that flow.

Minimally, let's add an annotate method on the resource selector (to mirror the label method).

But as you digest the email and we iterate further in this issue, I expect we will agree on more nuances we'll want to add.

@bparees fyi (I'm forwarding the email to you as well as reference).
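
For concreteness, hypothetical usage of the proposed method (nothing below exists in the plugin yet; names are illustrative only), mirroring how label() is invoked on a selector today:

openshift.withCluster() {
    openshift.withProject() {
        def is = openshift.selector('imagestream/my-image')
        // proposed: annotate the selected object(s), analogous to is.label([...], '--overwrite')
        is.annotate(['cd.example.com/promoted-on': "${System.currentTimeMillis()}"], '--overwrite')
    }
}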

Prevent linefeeds from being included on oc command line

When a linefeed is accidentally included in the token value for a credential used by the plugin, the script generated to execute oc contains an unexecutable line with a portion of the token on it.

In ClientCommandBuild.java (constructor or buildCommand(..)), an exception should be thrown if the token variable contains linefeeds (\r or \n). Every other component of the command line could technically be checked as well, but they are less likely to contain a linefeed since they are simple input fields in the Jenkins UI instead of textareas.
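
A minimal sketch of the proposed check (the surrounding class and variable names in the real plugin code may differ):

// reject credentials whose token contains carriage returns or newlines
if (token != null && (token.contains("\r") || token.contains("\n"))) {
    throw new IllegalArgumentException(
            "The supplied token contains linefeed characters; re-create the credential without them")
}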

How to verify deployment ?

Hi,

With https://github.com/openshift/jenkins-plugin, you can use the helper step openshiftVerifyDeployment to check whether the deployment actually fulfills our specs. It's easy.

So how can I do the same thing with the OpenShift client plugin?

For a build, the status is well expressed: we wait for the build to be Complete, and if it is, the build finished successfully.
There is no such status for a deployment, making it harder to verify.

Regards,
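
For reference, one way to approximate openshiftVerifyDeployment with this plugin is to watch the pods related to the DeploymentConfig, similar to other examples in these issues (a sketch, assuming a DeploymentConfig named 'myapp'):

openshift.withCluster() {
    openshift.withProject() {
        def dc = openshift.selector('dc', 'myapp')
        // block until at least one related pod reports Running (wrap in timeout { } in real use)
        dc.related('pods').untilEach(1) {
            return it.object().status.phase == 'Running'
        }
    }
}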

Command not found when using tag action with insecure

OpenShift Version: v3.6.173.0.21
Jenkins Version: 2.46.3
Plugins: https://github.com/RHsyseng/jenkins-on-openshift/blob/master/jenkins/plugins.txt

Observations: judging from the output and the error, it looks like --insecure-skip-tls-verify ends up on a new line.

Example Pipeline script section

script {
    withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: "registry-api", usernameVariable: 'USERNAME', passwordVariable: 'PASSWORD']]) {

        openshift.withCluster('insecure://internal-registry.host.prod.eng.rdu2.redhat.com:8443', env.PASSWORD) {
            openshift.withProject('lifecycle') {

                env.VERSION = readFile('app/VERSION')
                echo "Hello from project: ${openshift.project()}"

                istag = openshift.selector('istag')

                echo "${istag}"

                println("${openshift.raw('status').out}")

                openshift.verbose()
                openshift.tag("${openshift.project()}/${env.IMAGE_STREAM_NAME}:${env.GIT_COMMIT}", "${openshift.project()}/${env.IMAGE_STREAM_NAME}:${env.VERSION}")
                openshift.verbose(false)

            }
        }
    }
}

Output and error

Hello from project: lifecycle
[Pipeline] echo
selector([name=istag],[labels=null],[list=null])
[Pipeline] _OcAction
[Pipeline] echo
In project Software lifecycle (lifecycle) on server https://internal-registry.host.prod.eng.rdu2.redhat.com:8443

1 warning identified, use 'oc status -v' to see details.

[Pipeline] _OcAction
Verbose sub-step output:
	Command> oc tag lifecycle/nodejs-mongo-persistent:b9ba509 lifecycle/nodejs-mongo-persistent:1.0
 --insecure-skip-tls-verify --server=https://internal-registry.host.prod.eng.rdu2.redhat.com:8443 --namespace=lifecycle --loglevel=8 --token=XXXXX 
	Status> 127
	StdOut>
	StdErr> /var/lib/jenkins/jobs/jcallen/workspace@tmp/durable-a15b1de2/script.sh: line 3: --insecure-skip-tls-verify: command not found
	Reference> {}
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Stage - OpenShift DeploymentConfig)
Stage 'Stage - OpenShift DeploymentConfig' skipped due to earlier failure(s)
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Stage - Test)
Stage 'Stage - Test' skipped due to earlier failure(s)
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Production - Push Image)
Stage 'Production - Push Image' skipped due to earlier failure(s)
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: tag returned an error;
{reference={}, err=/var/lib/jenkins/jobs/jcallen/workspace@tmp/durable-a15b1de2/script.sh: line 3: --insecure-skip-tls-verify: command not found, verb=tag, cmd=oc tag lifecycle/nodejs-mongo-persistent:b9ba509 lifecycle/nodejs-mongo-persistent:1.0
 --insecure-skip-tls-verify --server=https://internal-registry.host.prod.eng.rdu2.redhat.com:8443 --namespace=lifecycle --loglevel=8 --token=XXXXX , out=, status=127}

Add support for "oc rsh"

It seems that "oc rsh" is not working with the openshift.raw() interface of the plugin:

Pipeline:

node {
  def ocDir = tool "oc"
  withEnv(["PATH+OC=${ocDir}"]) {
    openshift.withCluster('minishift') {
      openshift.withProject('myproject') {
        echo "HELLO FROM ${openshift.project()}"
        echo openshift.raw('version').out
        echo openshift.raw('get', 'pod', 'petclinic2-3-hjx72').out
        echo openshift.raw("rsh", "petclinic2-3-hjx72", "hostname").out
      }
    }
  }
}

I get the following:

Started by user admin
[Pipeline] node
Running on Jenkins in /var/lib/jenkins/workspace/openshift-test
[Pipeline] {
[Pipeline] tool
[Pipeline] withEnv
[Pipeline] {
[Pipeline] echo

[Pipeline] _OcContextInit
[Pipeline] _OcContextInit
[Pipeline] echo
HELLO FROM myproject
[Pipeline] _OcAction
[Pipeline] echo
oc v3.6.0+c4dd4cf
kubernetes v1.6.1+5115d708d7
features: Basic-Auth GSSAPI Kerberos SPNEGO

Server https://xxx.xxx.xxx.xxx:8443/
openshift v3.6.0+c4dd4cf
kubernetes v1.6.1+5115d708d7

[Pipeline] _OcAction
[Pipeline] echo
NAME                 READY     STATUS    RESTARTS   AGE
petclinic2-3-hjx72   1/1       Running   4          39d

[Pipeline] _OcAction
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: raw command [rsh, petclinic2-3-hjx72, hostname] returned an error;
{reference={}, err=Error from server (NotFound): pods "petclinic2-3-hjx72" not found, verb=, cmd=oc  rsh petclinic2-3-hjx72 hostname --insecure-skip-tls-verify --server=https://xxx.xxx.xxx.xxx:8443/ --namespace=myproject --token=XXXXX , out=, status=1}

Finished: FAILURE

Groovy DSL support seems to be missing

When I open the pipeline syntax page at a URL like http://localhost:8080/job/pipetti/pipeline-syntax/ and click "IntelliJ IDEA GDSL", I get a text file which sadly does not reveal the openshift API (e.g. openshift.withCluster()); it only exposes the global variable openshift related to your client-plugin API.

Instead, the legacy openshift pipeline plugin reveals methods like openshiftBuild(), etc.
Having a proper and full GDSL file would make editing of OpenShift-related APIs quicker and more effective.

The related gdsl is shown below:

//The global script scope
def ctx = context(scope: scriptScope())
contributor(ctx) {
method(name: 'build', type: 'Object', params: [job:'java.lang.String'], doc: 'Build a job')
method(name: 'build', type: 'Object', namedParams: [parameter(name: 'job', type: 'java.lang.String'), parameter(name: 'parameters', type: 'Map'), parameter(name: 'propagate', type: 'boolean'), parameter(name: 'quietPeriod', type: 'java.lang.Integer'), parameter(name: 'wait', type: 'boolean'), ], doc: 'Build a job')
method(name: 'echo', type: 'Object', params: [message:'java.lang.String'], doc: 'Print Message')
method(name: 'emailext', type: 'Object', namedParams: [parameter(name: 'subject', type: 'java.lang.String'), parameter(name: 'body', type: 'java.lang.String'), parameter(name: 'attachLog', type: 'boolean'), parameter(name: 'attachmentsPattern', type: 'java.lang.String'), parameter(name: 'compressLog', type: 'boolean'), parameter(name: 'from', type: 'java.lang.String'), parameter(name: 'mimeType', type: 'java.lang.String'), parameter(name: 'postsendScript', type: 'java.lang.String'), parameter(name: 'presendScript', type: 'java.lang.String'), parameter(name: 'recipientProviders', type: 'Map'), parameter(name: 'replyTo', type: 'java.lang.String'), parameter(name: 'to', type: 'java.lang.String'), ], doc: 'Extended Email')
method(name: 'emailextrecipients', type: 'Object', params: [recipientProviders:'Map'], doc: 'Extended Email Recipients')
method(name: 'error', type: 'Object', params: [message:'java.lang.String'], doc: 'Error signal')
method(name: 'input', type: 'Object', params: [message:'java.lang.String'], doc: 'Wait for interactive input')
method(name: 'input', type: 'Object', namedParams: [parameter(name: 'message', type: 'java.lang.String'), parameter(name: 'id', type: 'java.lang.String'), parameter(name: 'ok', type: 'java.lang.String'), parameter(name: 'parameters', type: 'Map'), parameter(name: 'submitter', type: 'java.lang.String'), parameter(name: 'submitterParameter', type: 'java.lang.String'), ], doc: 'Wait for interactive input')
method(name: 'isUnix', type: 'Object', params: [:], doc: 'Checks if running on a Unix-like node')
method(name: 'library', type: 'Object', params: [identifier:'java.lang.String'], doc: 'Load a shared library on the fly')
method(name: 'library', type: 'Object', namedParams: [parameter(name: 'identifier', type: 'java.lang.String'), parameter(name: 'changelog', type: 'java.lang.Boolean'), parameter(name: 'retriever', type: 'Map'), ], doc: 'Load a shared library on the fly')
method(name: 'libraryResource', type: 'Object', params: [resource:'java.lang.String'], doc: 'Load a resource file from a shared library')
method(name: 'mail', type: 'Object', namedParams: [parameter(name: 'subject', type: 'java.lang.String'), parameter(name: 'body', type: 'java.lang.String'), parameter(name: 'bcc', type: 'java.lang.String'), parameter(name: 'cc', type: 'java.lang.String'), parameter(name: 'charset', type: 'java.lang.String'), parameter(name: 'from', type: 'java.lang.String'), parameter(name: 'mimeType', type: 'java.lang.String'), parameter(name: 'replyTo', type: 'java.lang.String'), parameter(name: 'to', type: 'java.lang.String'), ], doc: 'Mail')
method(name: 'milestone', type: 'Object', params: [ordinal:'java.lang.Integer'], doc: 'The milestone step forces all builds to go through in order')
method(name: 'milestone', type: 'Object', namedParams: [parameter(name: 'ordinal', type: 'java.lang.Integer'), parameter(name: 'label', type: 'java.lang.String'), ], doc: 'The milestone step forces all builds to go through in order')
method(name: 'node', type: 'Object', params: [label:java.lang.String, body:'Closure'], doc: 'Allocate node')
method(name: 'properties', type: 'Object', params: [properties:'Map'], doc: 'Set job properties')
method(name: 'readTrusted', type: 'Object', params: [path:'java.lang.String'], doc: 'Read trusted file from SCM')
method(name: 'resolveScm', type: 'Object', namedParams: [parameter(name: 'source', type: 'Map'), parameter(name: 'targets', type: 'Map'), parameter(name: 'ignoreErrors', type: 'boolean'), ], doc: 'Resolves an SCM from an SCM Source and a list of candidate target branch names')
method(name: 'retry', type: 'Object', params: [count:int, body:'Closure'], doc: 'Retry the body up to N times')
method(name: 'script', type: 'Object', params: [body:'Closure'], doc: 'Run arbitrary Pipeline script')
method(name: 'sleep', type: 'Object', params: [time:'int'], doc: 'Sleep')
method(name: 'sleep', type: 'Object', namedParams: [parameter(name: 'time', type: 'int'), parameter(name: 'unit', type: 'java.util.concurrent.TimeUnit'), ], doc: 'Sleep')
method(name: 'stage', type: 'Object', params: [name:java.lang.String, body:'Closure'], doc: 'Stage')
method(name: 'stage', type: 'Object', params: [body:Closure], namedParams: [parameter(name: 'name', type: 'java.lang.String'), parameter(name: 'concurrency', type: 'java.lang.Integer'), ], doc: 'Stage')
method(name: 'timeout', type: 'Object', params: [time:int, body:'Closure'], doc: 'Enforce time limit')
method(name: 'timeout', type: 'Object', params: [body:Closure], namedParams: [parameter(name: 'time', type: 'int'), parameter(name: 'unit', type: 'java.util.concurrent.TimeUnit'), ], doc: 'Enforce time limit')
method(name: 'timestamps', type: 'Object', params: [body:'Closure'], doc: 'Timestamps')
method(name: 'tool', type: 'Object', params: [name:'java.lang.String'], doc: 'Use a tool from a predefined Tool Installation')
method(name: 'tool', type: 'Object', namedParams: [parameter(name: 'name', type: 'java.lang.String'), parameter(name: 'type', type: 'java.lang.String'), ], doc: 'Use a tool from a predefined Tool Installation')
method(name: 'waitUntil', type: 'Object', params: [body:'Closure'], doc: 'Wait for condition')
method(name: 'withCredentials', type: 'Object', params: [bindings:Map, body:'Closure'], doc: 'Bind credentials to variables')
method(name: 'withEnv', type: 'Object', params: [overrides:Map, body:'Closure'], doc: 'Set environment variables')
method(name: 'ws', type: 'Object', params: [dir:java.lang.String, body:'Closure'], doc: 'Allocate workspace')
method(name: 'catchError', type: 'Object', params: [body:'Closure'], doc: 'Advanced/Deprecated Catch error and set build result')
method(name: 'dockerFingerprintRun', type: 'Object', params: [containerId:'java.lang.String'], doc: 'Advanced/Deprecated Record trace of a Docker image run in a container')
method(name: 'dockerFingerprintRun', type: 'Object', namedParams: [parameter(name: 'containerId', type: 'java.lang.String'), parameter(name: 'toolName', type: 'java.lang.String'), ], doc: 'Record trace of a Docker image run in a container')
method(name: 'envVarsForTool', type: 'Object', namedParams: [parameter(name: 'toolId', type: 'java.lang.String'), parameter(name: 'toolVersion', type: 'java.lang.String'), ], doc: 'Fetches the environment variables for a given tool in a list of 'FOO=bar' strings suitable for the withEnv step.')
method(name: 'getContext', type: 'Object', params: [type:'Map'], doc: 'Advanced/Deprecated Get contextual object from internal APIs')
method(name: 'podTemplate', type: 'Object', params: [body:Closure], namedParams: [parameter(name: 'label', type: 'java.lang.String'), parameter(name: 'name', type: 'java.lang.String'), parameter(name: 'activeDeadlineSeconds', type: 'int'), parameter(name: 'annotations', type: 'Map'), parameter(name: 'cloud', type: 'java.lang.String'), parameter(name: 'containers', type: 'Map'), parameter(name: 'envVars', type: 'Map'), parameter(name: 'idleMinutes', type: 'int'), parameter(name: 'imagePullSecrets', type: 'Map'), parameter(name: 'inheritFrom', type: 'java.lang.String'), parameter(name: 'instanceCap', type: 'int'), parameter(name: 'namespace', type: 'java.lang.String'), parameter(name: 'nodeSelector', type: 'java.lang.String'), parameter(name: 'nodeUsageMode', type: 'java.lang.String'), parameter(name: 'serviceAccount', type: 'java.lang.String'), parameter(name: 'slaveConnectTimeout', type: 'int'), parameter(name: 'volumes', type: 'Map'), parameter(name: 'workingDir', type: 'java.lang.String'), parameter(name: 'workspaceVolume', type: 'Map'), ], doc: 'Define a podTemplate to use in the kubernetes plugin')
method(name: 'withContext', type: 'Object', params: [context:java.lang.Object, body:'Closure'], doc: 'Advanced/Deprecated Use contextual object from internal APIs within a block')
property(name: 'openshift', type: 'com.openshift.jenkins.plugins.pipeline.OpenShiftGlobalVariable')
property(name: 'docker', type: 'org.jenkinsci.plugins.docker.workflow.DockerDSL')
property(name: 'pipeline', type: 'org.jenkinsci.plugins.pipeline.modeldefinition.ModelStepLoader')
property(name: 'env', type: 'org.jenkinsci.plugins.workflow.cps.EnvActionImpl.Binder')
property(name: 'params', type: 'org.jenkinsci.plugins.workflow.cps.ParamsVariable')
property(name: 'currentBuild', type: 'org.jenkinsci.plugins.workflow.cps.RunWrapperBinder')
property(name: 'scm', type: 'org.jenkinsci.plugins.workflow.multibranch.SCMVar')
}
//Steps that require a node context
def nodeCtx = context(scope: closureScope())
contributor(nodeCtx) {
def call = enclosingCall('node')
if (call) {
method(name: 'bat', type: 'Object', params: [script:'java.lang.String'], doc: 'Windows Batch Script')
method(name: 'bat', type: 'Object', namedParams: [parameter(name: 'script', type: 'java.lang.String'), parameter(name: 'encoding', type: 'java.lang.String'), parameter(name: 'returnStatus', type: 'boolean'), parameter(name: 'returnStdout', type: 'boolean'), ], doc: 'Windows Batch Script')
method(name: 'checkout', type: 'Object', params: [scm:'Map'], doc: 'General SCM')
method(name: 'checkout', type: 'Object', namedParams: [parameter(name: 'scm', type: 'Map'), parameter(name: 'changelog', type: 'boolean'), parameter(name: 'poll', type: 'boolean'), ], doc: 'General SCM')
method(name: 'containerLog', type: 'Object', params: [name:'java.lang.String'], doc: 'Get container log from Kubernetes')
method(name: 'containerLog', type: 'Object', namedParams: [parameter(name: 'name', type: 'java.lang.String'), parameter(name: 'limitBytes', type: 'int'), parameter(name: 'returnLog', type: 'boolean'), parameter(name: 'sinceSeconds', type: 'int'), parameter(name: 'tailingLines', type: 'int'), ], doc: 'Get container log from Kubernetes')
method(name: 'deleteDir', type: 'Object', params: [:], doc: 'Recursively delete the current directory from the workspace')
method(name: 'dir', type: 'Object', params: [path:java.lang.String, body:'Closure'], doc: 'Change current directory')
method(name: 'fileExists', type: 'Object', params: [file:'java.lang.String'], doc: 'Verify if file exists in workspace')
method(name: 'git', type: 'Object', params: [url:'java.lang.String'], doc: 'Git')
method(name: 'git', type: 'Object', namedParams: [parameter(name: 'url', type: 'java.lang.String'), parameter(name: 'branch', type: 'java.lang.String'), parameter(name: 'changelog', type: 'boolean'), parameter(name: 'credentialsId', type: 'java.lang.String'), parameter(name: 'poll', type: 'boolean'), ], doc: 'Git')
method(name: 'load', type: 'Object', params: [path:'java.lang.String'], doc: 'Evaluate a Groovy source file into the Pipeline script')
method(name: 'openshiftBuild', type: 'Object', params: [bldCfg:'java.lang.String'], doc: 'Trigger OpenShift Build')
method(name: 'openshiftBuild', type: 'Object', namedParams: [parameter(name: 'bldCfg', type: 'java.lang.String'), parameter(name: 'apiURL', type: 'java.lang.String'), parameter(name: 'authToken', type: 'java.lang.String'), parameter(name: 'buildName', type: 'java.lang.String'), parameter(name: 'checkForTriggeredDeployments', type: 'java.lang.String'), parameter(name: 'commitID', type: 'java.lang.String'), parameter(name: 'env', type: 'Map'), parameter(name: 'namespace', type: 'java.lang.String'), parameter(name: 'showBuildLogs', type: 'java.lang.String'), parameter(name: 'verbose', type: 'java.lang.String'), parameter(name: 'waitTime', type: 'java.lang.String'), parameter(name: 'waitUnit', type: 'java.lang.String'), ], doc: 'Trigger OpenShift Build')
method(name: 'openshiftCreateResource', type: 'Object', params: [jsonyaml:'java.lang.String'], doc: 'Create OpenShift Resource(s)')
method(name: 'openshiftCreateResource', type: 'Object', namedParams: [parameter(name: 'jsonyaml', type: 'java.lang.String'), parameter(name: 'apiURL', type: 'java.lang.String'), parameter(name: 'authToken', type: 'java.lang.String'), parameter(name: 'namespace', type: 'java.lang.String'), parameter(name: 'verbose', type: 'java.lang.String'), ], doc: 'Create OpenShift Resource(s)')
method(name: 'openshiftDeleteResourceByJsonYaml', type: 'Object', params: [jsonyaml:'java.lang.String'], doc: 'Delete OpenShift Resource(s) from JSON/YAML')
method(name: 'openshiftDeleteResourceByJsonYaml', type: 'Object', namedParams: [parameter(name: 'jsonyaml', type: 'java.lang.String'), parameter(name: 'apiURL', type: 'java.lang.String'), parameter(name: 'authToken', type: 'java.lang.String'), parameter(name: 'namespace', type: 'java.lang.String'), parameter(name: 'verbose', type: 'java.lang.String'), ], doc: 'Delete OpenShift Resource(s) from JSON/YAML')
method(name: 'openshiftDeleteResourceByKey', type: 'Object', namedParams: [parameter(name: 'types', type: 'java.lang.String'), parameter(name: 'keys', type: 'java.lang.String'), parameter(name: 'apiURL', type: 'java.lang.String'), parameter(name: 'authToken', type: 'java.lang.String'), parameter(name: 'namespace', type: 'java.lang.String'), parameter(name: 'verbose', type: 'java.lang.String'), ], doc: 'Delete OpenShift Resource(s) by Key')
method(name: 'openshiftDeleteResourceByLabels', type: 'Object', namedParams: [parameter(name: 'types', type: 'java.lang.String'), parameter(name: 'keys', type: 'java.lang.String'), parameter(name: 'values', type: 'java.lang.String'), parameter(name: 'apiURL', type: 'java.lang.String'), parameter(name: 'authToken', type: 'java.lang.String'), parameter(name: 'namespace', type: 'java.lang.String'), parameter(name: 'verbose', type: 'java.lang.String'), ], doc: 'Delete OpenShift Resource(s) using Labels')
method(name: 'openshiftDeploy', type: 'Object', params: [depCfg:'java.lang.String'], doc: 'Trigger OpenShift Deployment')
method(name: 'openshiftDeploy', type: 'Object', namedParams: [parameter(name: 'depCfg', type: 'java.lang.String'), parameter(name: 'apiURL', type: 'java.lang.String'), parameter(name: 'authToken', type: 'java.lang.String'), parameter(name: 'namespace', type: 'java.lang.String'), parameter(name: 'verbose', type: 'java.lang.String'), parameter(name: 'waitTime', type: 'java.lang.String'), parameter(name: 'waitUnit', type: 'java.lang.String'), ], doc: 'Trigger OpenShift Deployment')
method(name: 'openshiftExec', type: 'Object', params: [pod:'java.lang.String'], doc: 'OpenShift Exec')
method(name: 'openshiftExec', type: 'Object', namedParams: [parameter(name: 'pod', type: 'java.lang.String'), parameter(name: 'apiURL', type: 'java.lang.String'), parameter(name: 'arguments', type: 'Map'), parameter(name: 'authToken', type: 'java.lang.String'), parameter(name: 'command', type: 'java.lang.String'), parameter(name: 'container', type: 'java.lang.String'), parameter(name: 'namespace', type: 'java.lang.String'), parameter(name: 'verbose', type: 'java.lang.String'), parameter(name: 'waitTime', type: 'java.lang.String'), parameter(name: 'waitUnit', type: 'java.lang.String'), ], doc: 'OpenShift Exec')
method(name: 'openshiftImageStream', type: 'Object', params: [:], doc: 'OpenShift ImageStreams')
method(name: 'openshiftImageStream', type: 'Object', namedParams: [parameter(name: 'name', type: 'java.lang.String'), parameter(name: 'tag', type: 'java.lang.String'), parameter(name: 'namespace', type: 'java.lang.String'), parameter(name: 'apiURL', type: 'java.lang.String'), parameter(name: 'authToken', type: 'java.lang.String'), parameter(name: 'changelog', type: 'boolean'), parameter(name: 'poll', type: 'boolean'), parameter(name: 'verbose', type: 'java.lang.String'), ], doc: 'OpenShift ImageStreams')
method(name: 'openshiftScale', type: 'Object', namedParams: [parameter(name: 'depCfg', type: 'java.lang.String'), parameter(name: 'replicaCount', type: 'java.lang.String'), parameter(name: 'apiURL', type: 'java.lang.String'), parameter(name: 'authToken', type: 'java.lang.String'), parameter(name: 'namespace', type: 'java.lang.String'), parameter(name: 'verbose', type: 'java.lang.String'), parameter(name: 'verifyReplicaCount', type: 'java.lang.String'), parameter(name: 'waitTime', type: 'java.lang.String'), parameter(name: 'waitUnit', type: 'java.lang.String'), ], doc: 'Scale OpenShift Deployment')
method(name: 'openshiftTag', type: 'Object', namedParams: [parameter(name: 'srcStream', type: 'java.lang.String'), parameter(name: 'srcTag', type: 'java.lang.String'), parameter(name: 'destStream', type: 'java.lang.String'), parameter(name: 'destTag', type: 'java.lang.String'), parameter(name: 'alias', type: 'java.lang.String'), parameter(name: 'apiURL', type: 'java.lang.String'), parameter(name: 'authToken', type: 'java.lang.String'), parameter(name: 'destinationAuthToken', type: 'java.lang.String'), parameter(name: 'destinationNamespace', type: 'java.lang.String'), parameter(name: 'namespace', type: 'java.lang.String'), parameter(name: 'verbose', type: 'java.lang.String'), ], doc: 'Tag OpenShift Image')
method(name: 'openshiftVerifyBuild', type: 'Object', params: [bldCfg:'java.lang.String'], doc: 'Verify OpenShift Build')
method(name: 'openshiftVerifyBuild', type: 'Object', namedParams: [parameter(name: 'bldCfg', type: 'java.lang.String'), parameter(name: 'apiURL', type: 'java.lang.String'), parameter(name: 'authToken', type: 'java.lang.String'), parameter(name: 'checkForTriggeredDeployments', type: 'java.lang.String'), parameter(name: 'namespace', type: 'java.lang.String'), parameter(name: 'verbose', type: 'java.lang.String'), parameter(name: 'waitTime', type: 'java.lang.String'), parameter(name: 'waitUnit', type: 'java.lang.String'), ], doc: 'Verify OpenShift Build')
method(name: 'openshiftVerifyDeployment', type: 'Object', params: [depCfg:'java.lang.String'], doc: 'Verify OpenShift Deployment')
method(name: 'openshiftVerifyDeployment', type: 'Object', namedParams: [parameter(name: 'depCfg', type: 'java.lang.String'), parameter(name: 'apiURL', type: 'java.lang.String'), parameter(name: 'authToken', type: 'java.lang.String'), parameter(name: 'namespace', type: 'java.lang.String'), parameter(name: 'replicaCount', type: 'java.lang.String'), parameter(name: 'verbose', type: 'java.lang.String'), parameter(name: 'verifyReplicaCount', type: 'java.lang.String'), parameter(name: 'waitTime', type: 'java.lang.String'), parameter(name: 'waitUnit', type: 'java.lang.String'), ], doc: 'Verify OpenShift Deployment')
method(name: 'openshiftVerifyService', type: 'Object', params: [svcName:'java.lang.String'], doc: 'Verify OpenShift Service')
method(name: 'openshiftVerifyService', type: 'Object', namedParams: [parameter(name: 'svcName', type: 'java.lang.String'), parameter(name: 'apiURL', type: 'java.lang.String'), parameter(name: 'authToken', type: 'java.lang.String'), parameter(name: 'namespace', type: 'java.lang.String'), parameter(name: 'retryCount', type: 'java.lang.String'), parameter(name: 'verbose', type: 'java.lang.String'), ], doc: 'Verify OpenShift Service')
method(name: 'powershell', type: 'Object', params: [script:'java.lang.String'], doc: 'PowerShell Script')
method(name: 'powershell', type: 'Object', namedParams: [parameter(name: 'script', type: 'java.lang.String'), parameter(name: 'encoding', type: 'java.lang.String'), parameter(name: 'returnStatus', type: 'boolean'), parameter(name: 'returnStdout', type: 'boolean'), ], doc: 'PowerShell Script')
method(name: 'pwd', type: 'Object', params: [:], doc: 'Determine current directory')
method(name: 'pwd', type: 'Object', namedParams: [parameter(name: 'tmp', type: 'boolean'), ], doc: 'Determine current directory')
method(name: 'readFile', type: 'Object', params: [file:'java.lang.String'], doc: 'Read file from workspace')
method(name: 'readFile', type: 'Object', namedParams: [parameter(name: 'file', type: 'java.lang.String'), parameter(name: 'encoding', type: 'java.lang.String'), ], doc: 'Read file from workspace')
method(name: 'sh', type: 'Object', params: [script:'java.lang.String'], doc: 'Shell Script')
method(name: 'sh', type: 'Object', namedParams: [parameter(name: 'script', type: 'java.lang.String'), parameter(name: 'encoding', type: 'java.lang.String'), parameter(name: 'returnStatus', type: 'boolean'), parameter(name: 'returnStdout', type: 'boolean'), ], doc: 'Shell Script')
method(name: 'stash', type: 'Object', params: [name:'java.lang.String'], doc: 'Stash some files to be used later in the build')
method(name: 'stash', type: 'Object', namedParams: [parameter(name: 'name', type: 'java.lang.String'), parameter(name: 'allowEmpty', type: 'boolean'), parameter(name: 'excludes', type: 'java.lang.String'), parameter(name: 'includes', type: 'java.lang.String'), parameter(name: 'useDefaultExcludes', type: 'boolean'), ], doc: 'Stash some files to be used later in the build')
method(name: 'step', type: 'Object', params: [delegate:'Map'], doc: 'General Build Step')
method(name: 'svn', type: 'Object', params: [url:'java.lang.String'], doc: 'Subversion')
method(name: 'svn', type: 'Object', namedParams: [parameter(name: 'url', type: 'java.lang.String'), parameter(name: 'changelog', type: 'boolean'), parameter(name: 'poll', type: 'boolean'), ], doc: 'Subversion')
method(name: 'tm', type: 'Object', params: [stringWithMacro:'java.lang.String'], doc: 'Expand a string containing macros')
method(name: 'unstash', type: 'Object', params: [name:'java.lang.String'], doc: 'Restore files previously stashed')
method(name: 'validateDeclarativePipeline', type: 'Object', params: [path:'java.lang.String'], doc: 'Validate a file containing a Declarative Pipeline')
method(name: 'wrap', type: 'Object', params: [delegate:Map, body:'Closure'], doc: 'General Build Wrapper')
method(name: 'writeFile', type: 'Object', namedParams: [parameter(name: 'file', type: 'java.lang.String'), parameter(name: 'text', type: 'java.lang.String'), parameter(name: 'encoding', type: 'java.lang.String'), ], doc: 'Write file to workspace')
method(name: '_OcAction', type: 'Object', namedParams: [parameter(name: 'server', type: 'java.lang.String'), parameter(name: 'project', type: 'java.lang.String'), parameter(name: 'verb', type: 'java.lang.String'), parameter(name: 'verbArgs', type: 'java.util.List'), parameter(name: 'userArgs', type: 'java.util.List'), parameter(name: 'options', type: 'java.util.List'), parameter(name: 'verboseOptions', type: 'java.util.List'), parameter(name: 'token', type: 'java.lang.String'), parameter(name: 'streamStdOutToConsolePrefix', type: 'java.lang.String'), parameter(name: 'reference', type: 'Map'), parameter(name: 'logLevel', type: 'int'), ], doc: 'Internal utility function for OpenShift DSL')
method(name: '_OcContextInit', type: 'Object', params: [:], doc: 'Advanced/Deprecated Internal utility function for OpenShift DSL')
method(name: '_OcWatch', type: 'Object', params: [body:Closure], namedParams: [parameter(name: 'server', type: 'java.lang.String'), parameter(name: 'project', type: 'java.lang.String'), parameter(name: 'verb', type: 'java.lang.String'), parameter(name: 'verbArgs', type: 'java.util.List'), parameter(name: 'userArgs', type: 'java.util.List'), parameter(name: 'options', type: 'java.util.List'), parameter(name: 'verboseOptions', type: 'java.util.List'), parameter(name: 'token', type: 'java.lang.String'), parameter(name: 'logLevel', type: 'int'), ], doc: 'Internal utility function for OpenShift DSL')
method(name: 'archive', type: 'Object', params: [includes:'java.lang.String'], doc: 'Advanced/Deprecated Archive artifacts')
method(name: 'archive', type: 'Object', namedParams: [parameter(name: 'includes', type: 'java.lang.String'), parameter(name: 'excludes', type: 'java.lang.String'), ], doc: 'Archive artifacts')
method(name: 'container', type: 'Object', params: [name:java.lang.String, body:'Closure'], doc: 'Advanced/Deprecated Run build steps in a container')
method(name: 'dockerFingerprintFrom', type: 'Object', namedParams: [parameter(name: 'dockerfile', type: 'java.lang.String'), parameter(name: 'image', type: 'java.lang.String'), parameter(name: 'toolName', type: 'java.lang.String'), ], doc: 'Record trace of a Docker image used in FROM')
method(name: 'unarchive', type: 'Object', params: [:], doc: 'Advanced/Deprecated Copy archived artifacts into the workspace')
method(name: 'unarchive', type: 'Object', namedParams: [parameter(name: 'mapping', type: 'Map'), ], doc: 'Copy archived artifacts into the workspace')
method(name: 'withDockerContainer', type: 'Object', params: [image:java.lang.String, body:'Closure'], doc: 'Advanced/Deprecated Run build steps inside a Docker container')
method(name: 'withDockerContainer', type: 'Object', params: [body:Closure], namedParams: [parameter(name: 'image', type: 'java.lang.String'), parameter(name: 'args', type: 'java.lang.String'), parameter(name: 'toolName', type: 'java.lang.String'), ], doc: 'Run build steps inside a Docker container')
method(name: 'withDockerRegistry', type: 'Object', params: [registry:Map, body:'Closure'], doc: 'Advanced/Deprecated Sets up Docker registry endpoint')
method(name: 'withDockerServer', type: 'Object', params: [server:Map, body:'Closure'], doc: 'Advanced/Deprecated Sets up Docker server endpoint')
}
}

// Errors on:
// class org.jenkinsci.plugins.workflow.cps.steps.ParallelStep: There's no @DataBoundConstructor on any constructor of class org.jenkinsci.plugins.workflow.cps.steps.ParallelStep

Status -2 returned by oc client

Hi, when I try to run any oc command using the Jenkins plugin I get an error like this:

ERROR: tag returned an error;
{reference={}, err=, verb=tag, cmd=oc tag pablo-test/petclinic:latest pablodemo-test/petclinic:latest --insecure-skip-tls-verify --server=xxxxxxx --namespace=pablo-test --token=XXXXX , out=, status=-2}

Not sure if it is a proxy issue; if it were, I would expect some error message to be shown. I tried to set the http_proxy environment variable, but nothing changed.

Unknown command "perform" for "oc" when using DSL method openshift.run()

While working on additional testing for playbook2image, I ran into this error. This was just a test pipeline to determine the return objects for openshift.run().

Versions:
openshift v3.5.5.5
OpenShift Jenkins Client Plugin: 0.9.2

Error

OpenShift Build p2i/createcred-pipeline-1
[Pipeline] node
Running on master in /var/lib/jenkins/jobs/p2i-createcred-pipeline/workspace
[Pipeline] {
[Pipeline] stage
[Pipeline] { (openshift run)
[Pipeline] echo

[Pipeline] _OcContextInit
[Pipeline] _OcContextInit
[Pipeline] readFile
[Pipeline] readFile
[Pipeline] _OcAction
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: perform returned an error;
{reference={}, err=Error: unknown command "perform" for "oc"
Run 'oc --help' for usage., verb=perform, cmd=oc perform test-app-jenkins --image=172.30.10.223:5000/p2i/test-app --restart=Never --env OPTS='-vvv -u 1001 --connection local' --env INVENTORY_FILE=inventory --env PLAYBOOK_FILE=test-playbook.yaml --server=https://172.30.0.1:443 --namespace=p2i --token=XXXXX , out=, status=1}

Finished: FAILURE

BuildConfig w/Pipeline

---
apiVersion: v1
kind: BuildConfig
metadata:
  name: "createcred-pipeline" 
spec:
  strategy:
    type: "JenkinsPipeline"
    jenkinsPipelineStrategy:
        jenkinsfile: |-
            node {
                stage('openshift run') {
                    openshift.withCluster() {
                        openshift.withProject() {
                            def run = openshift.run("test-app-jenkins", 
                                "--image=172.30.10.223:5000/p2i/test-app",
                                "--restart=Never",
                                "--env OPTS='-vvv -u 1001 --connection local'",
                                "--env INVENTORY_FILE=inventory",
                                "--env PLAYBOOK_FILE=test-playbook.yaml" )
                            println(run)
                        }
                    }    
                }
            }

Is the "perform" argument to simplePassthrough correct?

https://github.com/openshift/jenkins-client-plugin/blob/master/src/main/resources/com/openshift/jenkins/plugins/OpenShiftDSL.groovy#L658
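
Until the verb mapping is confirmed or fixed, a possible interim workaround is to pass the verb through raw() (a sketch; the --env flags from the original example are omitted for brevity):

openshift.withCluster() {
    openshift.withProject() {
        def result = openshift.raw('run', 'test-app-jenkins',
                '--image=172.30.10.223:5000/p2i/test-app',
                '--restart=Never')
        println(result.out)
    }
}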

Mechanism to get user token

I am currently with a customer (I am a Red Hatter, part of UK Consulting) who wants to use a Jenkins pipeline to deploy certain resources (DCs, secrets, etc.) into a project owned by a certain user - our Jenkins instance does not have permissions within their project, and it might be in a different cluster.

To deploy those resources, we get the user's token, and then use that token as part of the "withCluster()" later on in the pipeline, so as to "impersonate" the user.

The following bit of code works, and gets back a token:

stages {
    stage ('Get token via Plugins') {
        steps {
            script {
                openshift.withCluster("${params.CLUSTER_ID}") {
                    openshift.withProject("${params.PROJECT_NAME}") {
                        openshift.raw("login", "--token='' --username=${params.SYSTEM_ACCOUNT_NAME} --password=${params.SYSTEM_ACCOUNT_PASSWORD}")

                        def token = openshift.raw("whoami", "--token='' -t")
                        echo "token == ${token.out}"
                    }   
                }
            }
        }
    }
}

However, it includes a hack, in that I provide the token as empty to trick the plugin into not adding it onto the command line.

Would it be possible to either provide a "whoami" method, which would return the token, or an env var to tell the plugin not to set the token, so that I don't need to provide the empty token?

Or, can the above be achieved in a different way?

I know it's probably a bit of a strange use case, but it is something that more customers might want to do.

Compilation error with latest jenkins image

Using the latest Jenkins image (which was upgraded to use the Pipeline Shared Groovy Libraries Plugin v2.5, https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Shared+Groovy+Libraries+Plugin), a compilation exception is thrown when we try to use the client plugin:

hudson.remoting.ProxyException: org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
jar:file:/var/lib/jenkins/plugins/openshift-client/WEB-INF/lib/openshift-client.jar!/com/openshift/jenkins/plugins/OpenShiftDSL.groovy: 38: unable to resolve class java.lang$Enum 
 @ line 38, column 5.
       enum ContextId implements Serializable{
       ^

jar:file:/var/lib/jenkins/plugins/openshift-client/WEB-INF/lib/openshift-client.jar!/com/openshift/jenkins/plugins/OpenShiftDSL.groovy: -1: unable to resolve class java.lang$Enum 
 @ line -1, column -1.
2 errors

	at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:310)
	at org.codehaus.groovy.control.CompilationUnit.applyToSourceUnits(CompilationUnit.java:946)
	at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:593)
	at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:542)
	at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:298)
	at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:268)
	at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:254)
	at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:250)
	at groovy.lang.GroovyClassLoader.recompile(GroovyClassLoader.java:766)
	at groovy.lang.GroovyClassLoader.loadClass(GroovyClassLoader.java:718)
	at groovy.lang.GroovyClassLoader.loadClass(GroovyClassLoader.java:787)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
	at groovy.lang.GroovyClassLoader.loadClass(GroovyClassLoader.java:677)
	at groovy.lang.GroovyClassLoader$InnerLoader.loadClass(GroovyClassLoader.java:425)
	at groovy.lang.GroovyClassLoader.loadClass(GroovyClassLoader.java:787)
	at groovy.lang.GroovyClassLoader.loadClass(GroovyClassLoader.java:775)
	at com.openshift.jenkins.plugins.pipeline.OpenShiftGlobalVariable.getValue(OpenShiftGlobalVariable.java:34)
	at org.jenkinsci.plugins.workflow.cps.CpsScript.getProperty(CpsScript.java:121)
	at org.codehaus.groovy.runtime.InvokerHelper.getProperty(InvokerHelper.java:172)
	at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.getProperty(ScriptBytecodeAdapter.java:456)
	at org.kohsuke.groovy.sandbox.impl.Checker$4.call(Checker.java:243)
	at org.kohsuke.groovy.sandbox.GroovyInterceptor.onGetProperty(GroovyInterceptor.java:52)
	at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onGetProperty(SandboxInterceptor.java:308)
	at org.kohsuke.groovy.sandbox.impl.Checker$4.call(Checker.java:241)
	at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:238)
	at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:221)
	at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:221)
	at com.cloudbees.groovy.cps.sandbox.SandboxInvoker.getProperty(SandboxInvoker.java:28)
	at com.cloudbees.groovy.cps.impl.PropertyAccessBlock.rawGet(PropertyAccessBlock.java:20)
	at WorkflowScript.run(WorkflowScript:3)
	at ___cps.transform___(Native Method)
	at com.cloudbees.groovy.cps.impl.PropertyishBlock$ContinuationImpl.get(PropertyishBlock.java:74)
	at com.cloudbees.groovy.cps.LValueBlock$GetAdapter.receive(LValueBlock.java:30)
	at com.cloudbees.groovy.cps.impl.PropertyishBlock$ContinuationImpl.fixName(PropertyishBlock.java:66)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.cloudbees.groovy.cps.impl.ContinuationPtr$ContinuationImpl.receive(ContinuationPtr.java:72)
	at com.cloudbees.groovy.cps.impl.ConstantBlock.eval(ConstantBlock.java:21)
	at com.cloudbees.groovy.cps.Next.step(Next.java:74)
	at com.cloudbees.groovy.cps.Continuable.run0(Continuable.java:154)
	at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.access$001(SandboxContinuable.java:18)
	at org.jenkinsci.plugins.workflow.cps.SandboxContinuable$1.call(SandboxContinuable.java:33)
	at org.jenkinsci.plugins.workflow.cps.SandboxContinuable$1.call(SandboxContinuable.java:30)
	at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.GroovySandbox.runInSandbox(GroovySandbox.java:108)
	at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.run0(SandboxContinuable.java:30)
	at org.jenkinsci.plugins.workflow.cps.CpsThread.runNextChunk(CpsThread.java:165)
	at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.run(CpsThreadGroup.java:328)
	at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.access$100(CpsThreadGroup.java:80)
	at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:240)
	at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:228)
	at org.jenkinsci.plugins.workflow.cps.CpsVmExecutorService$2.call(CpsVmExecutorService.java:64)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at hudson.remoting.SingleLaneExecutorService$1.run(SingleLaneExecutorService.java:112)
	at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
Finished: FAILURE

Auth token not masked when using openshift.verbose()

When running with openshift.verbose() enabled, the token stored in Jenkins as the os-developer "OpenShift Token for OpenShift Client Plugin" credential is logged in plain text.

Example pipeline:

pipeline {
    agent any
    parameters {
        string(name: 'PROJECT', description: 'The project to describe resources from.')
    }
    stages {
        stage('Describe resources'){
            steps {
                script{
                    openshift.verbose()
                    openshift.withCluster( ) {
                        openshift.withProject( "${params.PROJECT}" ) {
                            openshift.doAs( 'os-developer' ) {
                                def selector = openshift.selector( 'all' )
                                selector.describe()
                            }
                        }
                    }
                    
                }
            }
        }
    }
}

The command is masked correctly:

Command> oc --server=https://172.30.0.1:443 --namespace=testproj --certificate-authority=/var/run/secrets/kubernetes.io/serviceaccount/ca.crt --loglevel=8 --token=XXXXX describe all

But the token is printed in clear text when http traffic is logged:

...
StdErr> I1221 21:11:49.507844 1094 round_trippers.go:414] GET https://172.30.0.1:443/oapi/v1/namespaces/testproj/buildconfigs?includeUninitialized=true
I1221 21:11:49.507949 1094 round_trippers.go:421] Request Headers:
I1221 21:11:49.507955 1094 round_trippers.go:424] Accept: application/json
I1221 21:11:49.507960 1094 round_trippers.go:424] User-Agent: oc/v1.8.1+0d5291c (linux/amd64) kubernetes/0d5291c
I1221 21:11:49.507963 1094 round_trippers.go:424] Authorization: Bearer hrHoOFyeANcnhVKsXgLYs2o5M4Q-INYf72bTek6Ou_A
I1221 21:11:49.536286 1094 round_trippers.go:439] Response Status: 200 OK in 28 milliseconds
I1221 21:11:49.536326 1094 round_trippers.go:442] Response Headers:
I1221 21:11:49.536332 1094 round_trippers.go:445] Content-Length: 797
...

Plugin forces to install 'oc' client manually

Hi,

I have Jenkins running outside OpenShift. I have to install the 'oc' client manually if I want to use this plugin. It would be great if the plugin shipped with the 'oc' client so it could be installed on the Jenkins machine.

Thanks
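
In the meantime, a pattern that appears elsewhere in these issues is to register 'oc' as a Jenkins tool installation and put it on the PATH for the block (assuming a tool named "oc" is configured under Global Tool Configuration):

node {
    def ocDir = tool "oc"
    withEnv(["PATH+OC=${ocDir}"]) {
        openshift.withCluster('mycluster') {
            echo openshift.raw('version').out
        }
    }
}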

Stop spamming the pipeline with sub-step information

@jupierce used the quiet listener factory or whatever to stop a bunch of output from being shown inside of the watch -- would be great to get every output of openshift to not spam my pipeline with

[Pipeline] readFile
[Pipeline] readFile
[Pipeline] _OcAction
[Pipeline] readFile
[Pipeline] readFile
[Pipeline] _OcAction

[Pipeline] readFile
[Pipeline] readFile
[Pipeline] _OcAction

/cc @bparees @gabemontero

new oc rsh partially broken

Hello, I tested the code introduced by #81 and ran into an issue. I built the plugin with:

docker run -it --rm -v $(pwd):/jenkins-plugin -v $(pwd)/../.m2:/root/.m2 -w /jenkins-plugin maven:latest mvn package

I installed the plugin; however, some commands still fail:

Pipeline Code:

  echo openshift.rsh(shortname,"ps ax").out

Result:

ERROR: rsh returned an error;
{reference={}, err=error: unknown gnu long option

Usage:
 ps [options]

 Try 'ps --help <simple|list|output|threads|misc|all>'
  or 'ps --help <s|l|o|t|m|a>'
 for additional help text.

For more details see ps(1).
command terminated with exit code 1, verb=rsh, cmd=oc --server=https://xxx.xxx.xxx.xxx:8443/ --namespace=myproject --token=XXXXX rsh centos7-im0t0k1k-1-62s9v ps ax --insecure-skip-tls-verify , out=, status=1}

It seems you have to move ALL of the --xxx options in front of the command.

This works:

Pipeline:

echo openshift.rsh(shortname,"echo").out

Result:

[Pipeline] _OcAction
[Pipeline] echo
--insecure-skip-tls-verify

Proxy configuration

I need to connect to Openshift through a corporate proxy. How can I configure the proxy in this plugin?
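
If the plugin does not surface a proxy setting of its own, one approach that may work, since it shells out to the oc binary, is exporting the standard proxy environment variables around the client-plugin calls (a sketch; the proxy host, port, and exclusions are placeholders):

withEnv(['HTTPS_PROXY=http://proxy.example.com:3128',
         'HTTP_PROXY=http://proxy.example.com:3128',
         'NO_PROXY=localhost,127.0.0.1,.cluster.local']) {
    openshift.withCluster('mycluster') {
        openshift.withProject('myproject') {
            echo openshift.raw('version').out
        }
    }
}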

Argument in wrong position when executing Selector.rollout().status("-w")

Hi, I tried to run the command Selector.rollout().status("-w"), but it looks like the -w argument is placed in the wrong position and the oc command returns an error:

ERROR: rollout:status returned an error;
{reference={}, err=Error: unknown shorthand flag: 'w' in -w

Usage:
oc rollout SUBCOMMAND [options]

Available Commands:
cancel cancel the in-progress deployment
history View rollout history
latest Start a new rollout for a deployment config with the latest state from its triggers
pause Mark the provided resource as paused
resume Resume a paused resource
retry Retry the latest failed rollout
status Show the status of the rollout
undo Undo a previous rollout

Use "oc --help" for more information about a given command.
Use "oc options" for a list of global command-line options (applies to all commands)., verb=rollout, cmd=oc rollout -w status deploymentconfig/jws-app --insecure-skip-tls-verify --server=xxxxxx --namespace=pablodemo-test --token=XXXXX , out=, status=1}

Error when using plugin in custom DSL

I am trying to extract part of the code that I will be reusing in different scripts. Currently, it is directly in the Jenkinsfile. For example:

stage('Check service directly from Jenkinsfile') {
    steps {
        script {
            openshift.withCluster("${OPENSHIFT_CLUSTER}") {
                openshift.withProject("${OPENSHIFT_PROJECT}") {
                    def serviceSelector = openshift.selector('service', "${OPENSHIFT_SERVICE}")
                    def serviceDescription = serviceSelector.describe()
                    echo "service: ${serviceDescription}"
                }
            }
        }
    }
}

This is working as expected.

When extracting the code into a Groovy method (file vars/getServiceDescription.groovy):

def call(body) {
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()

    script {
        openshift.withCluster("${config.cluster}") {
            openshift.withProject("${config.project}") {
                def serviceSelector = openshift.selector('service', "${config.service}")
                def serviceDescription = serviceSelector.describe()
                echo "service: ${serviceDescription}"
            }
        }
    }
}

and calling it from my Jenkinsfile:

stage('Check service from external method') {
    steps {
        getServiceDescription cluster: "${OPENSHIFT_CLUSTER}", project: "${OPENSHIFT_PROJECT}", service: "${OPENSHIFT_SERVICE}"
    }
}

the resulting error/stacktrace is:

java.nio.file.NoSuchFileException: /var/run/secrets/kubernetes.io/serviceaccount/token
	at sun.nio.fs.UnixException.translateToIOException(UnixException.java:86)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
	at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
	at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
	at java.nio.file.Files.newByteChannel(Files.java:361)
	at java.nio.file.Files.newByteChannel(Files.java:407)
	at java.nio.file.spi.FileSystemProvider.newInputStream(FileSystemProvider.java:384)
	at java.nio.file.Files.newInputStream(Files.java:152)
	at hudson.FilePath.read(FilePath.java:1771)
	at org.jenkinsci.plugins.workflow.steps.ReadFileStep$Execution.run(ReadFileStep.java:96)
	at org.jenkinsci.plugins.workflow.steps.ReadFileStep$Execution.run(ReadFileStep.java:86)
	at org.jenkinsci.plugins.workflow.steps.SynchronousNonBlockingStepExecution$1$1.call(SynchronousNonBlockingStepExecution.java:49)
	at hudson.security.ACL.impersonate(ACL.java:260)
	at org.jenkinsci.plugins.workflow.steps.SynchronousNonBlockingStepExecution$1.run(SynchronousNonBlockingStepExecution.java:46)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

Any help would be appreciated.
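
For what it is worth, one likely cause: the shared-library step is invoked with named arguments (a Map) rather than a closure, and the closure boilerplate never calls body(), so config stays empty; the plugin then falls back to reading the in-cluster service account token, which matches the NoSuchFileException above. A minimal, untested sketch of vars/getServiceDescription.groovy that accepts the named arguments directly:

def call(Map config) {
    openshift.withCluster("${config.cluster}") {
        openshift.withProject("${config.project}") {
            def serviceSelector = openshift.selector('service', "${config.service}")
            def serviceDescription = serviceSelector.describe()
            echo "service: ${serviceDescription}"
        }
    }
}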

No such property: objs for class: groovy.lang.Binding

Created a pipeline using the migrate example, changing it up for my needs. Most of the examples work; just the part creating objects doesn't appear to be working.

openshift.withCluster( '' ) {
  openshift.withProject( 'test-project' ) {
    echo "Hello from a Source project: ${openshift.project()}"
    def maps = openshift.selector( 'deploymentconfig/fun-stuff' )
    def objs = maps.objects( exportable:true )
    def timestamp = "${System.currentTimeMillis()}"
    for ( obj in objs ) {
      obj.metadata.labels[ "promoted-on" ] = timestamp
    }
  }
  openshift.withProject( 'stage' ) {
    echo "Hello from a nDestination project: ${openshift.project()}"
    openshift.create( objs )
    maps.related( 'pods' ).untilEach {
      return it.object().status.phase == 'Running'
    }
  }
}

Got this error in jenkins

OpenShift Build deployment/test-pipeline-1
[Pipeline] echo

[Pipeline] node
Running on master in /var/lib/jenkins/jobs/deployment-test-pipeline/workspace
[Pipeline] {
[Pipeline] _OcContextInit
[Pipeline] _OcContextInit
[Pipeline] echo
Hello from a Source project: test-project
[Pipeline] readFile
[Pipeline] _OcAction
[Pipeline] _OcContextInit
[Pipeline] echo
Hello from a nDestination project: stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
groovy.lang.MissingPropertyException: No such property: objs for class: groovy.lang.Binding
	at groovy.lang.Binding.getVariable(Binding.java:63)
	at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onGetProperty(SandboxInterceptor.java:224)
	at org.kohsuke.groovy.sandbox.impl.Checker$4.call(Checker.java:241)
	at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:238)
	at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:221)
	at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:221)
	at com.cloudbees.groovy.cps.sandbox.SandboxInvoker.getProperty(SandboxInvoker.java:28)
	at com.cloudbees.groovy.cps.impl.PropertyAccessBlock.rawGet(PropertyAccessBlock.java:20)
	at WorkflowScript.run(WorkflowScript:13)
	at com.openshift.jenkins.plugins.OpenShiftDSL.withProject(jar:file:/var/lib/jenkins/plugins/openshift-client/WEB-INF/lib/openshift-client.jar!/com/openshift/jenkins/plugins/OpenShiftDSL.groovy:317)
	at com.openshift.jenkins.plugins.OpenShiftDSL$Context.run(jar:file:/var/lib/jenkins/plugins/openshift-client/WEB-INF/lib/openshift-client.jar!/com/openshift/jenkins/plugins/OpenShiftDSL.groovy:106)
	at com.openshift.jenkins.plugins.OpenShiftDSL.withProject(jar:file:/var/lib/jenkins/plugins/openshift-client/WEB-INF/lib/openshift-client.jar!/com/openshift/jenkins/plugins/OpenShiftDSL.groovy:316)
	at WorkflowScript.run(WorkflowScript:11)
	at com.openshift.jenkins.plugins.OpenShiftDSL.withCluster(jar:file:/var/lib/jenkins/plugins/openshift-client/WEB-INF/lib/openshift-client.jar!/com/openshift/jenkins/plugins/OpenShiftDSL.groovy:305)
	at com.openshift.jenkins.plugins.OpenShiftDSL$Context.run(jar:file:/var/lib/jenkins/plugins/openshift-client/WEB-INF/lib/openshift-client.jar!/com/openshift/jenkins/plugins/OpenShiftDSL.groovy:106)
	at com.openshift.jenkins.plugins.OpenShiftDSL.withCluster(jar:file:/var/lib/jenkins/plugins/openshift-client/WEB-INF/lib/openshift-client.jar!/com/openshift/jenkins/plugins/OpenShiftDSL.groovy:304)
	at com.openshift.jenkins.plugins.OpenShiftDSL.node(jar:file:/var/lib/jenkins/plugins/openshift-client/WEB-INF/lib/openshift-client.jar!/com/openshift/jenkins/plugins/OpenShiftDSL.groovy:1289)
	at ___cps.transform___(Native Method)
	at com.cloudbees.groovy.cps.impl.PropertyishBlock$ContinuationImpl.get(PropertyishBlock.java:74)
	at com.cloudbees.groovy.cps.LValueBlock$GetAdapter.receive(LValueBlock.java:30)
	at com.cloudbees.groovy.cps.impl.PropertyishBlock$ContinuationImpl.fixName(PropertyishBlock.java:66)
	at sun.reflect.GeneratedMethodAccessor257.invoke(Unknown Source)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at com.cloudbees.groovy.cps.impl.ContinuationPtr$ContinuationImpl.receive(ContinuationPtr.java:72)
	at com.cloudbees.groovy.cps.impl.ConstantBlock.eval(ConstantBlock.java:21)
	at com.cloudbees.groovy.cps.Next.step(Next.java:83)
	at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:173)
	at com.cloudbees.groovy.cps.Continuable$1.call(Continuable.java:162)
	at org.codehaus.groovy.runtime.GroovyCategorySupport$ThreadCategoryInfo.use(GroovyCategorySupport.java:122)
	at org.codehaus.groovy.runtime.GroovyCategorySupport.use(GroovyCategorySupport.java:261)
	at com.cloudbees.groovy.cps.Continuable.run0(Continuable.java:162)
	at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.access$001(SandboxContinuable.java:19)
	at org.jenkinsci.plugins.workflow.cps.SandboxContinuable$1.call(SandboxContinuable.java:35)
	at org.jenkinsci.plugins.workflow.cps.SandboxContinuable$1.call(SandboxContinuable.java:32)
	at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.GroovySandbox.runInSandbox(GroovySandbox.java:108)
	at org.jenkinsci.plugins.workflow.cps.SandboxContinuable.run0(SandboxContinuable.java:32)
	at org.jenkinsci.plugins.workflow.cps.CpsThread.runNextChunk(CpsThread.java:174)
	at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.run(CpsThreadGroup.java:330)
	at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.access$100(CpsThreadGroup.java:82)
	at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:242)
	at org.jenkinsci.plugins.workflow.cps.CpsThreadGroup$2.call(CpsThreadGroup.java:230)
	at org.jenkinsci.plugins.workflow.cps.CpsVmExecutorService$2.call(CpsVmExecutorService.java:64)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at hudson.remoting.SingleLaneExecutorService$1.run(SingleLaneExecutorService.java:112)
	at jenkins.util.ContextResettingExecutorService$1.run(ContextResettingExecutorService.java:28)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
Finished: FAILURE
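
For reference, a likely explanation: objs and maps are declared with def inside the first withProject closure, so they are no longer in scope in the second withProject block, and Groovy then looks them up in the script Binding, which produces the MissingPropertyException above. An untested sketch of the same pipeline with the declarations hoisted out of the inner closures:

openshift.withCluster( '' ) {
  def maps = null
  def objs = null
  openshift.withProject( 'test-project' ) {
    echo "Hello from a Source project: ${openshift.project()}"
    maps = openshift.selector( 'deploymentconfig/fun-stuff' )
    objs = maps.objects( exportable:true )
    def timestamp = "${System.currentTimeMillis()}"
    for ( obj in objs ) {
      obj.metadata.labels[ "promoted-on" ] = timestamp
    }
  }
  openshift.withProject( 'stage' ) {
    echo "Hello from a Destination project: ${openshift.project()}"
    openshift.create( objs )
    maps.related( 'pods' ).untilEach {
      return it.object().status.phase == 'Running'
    }
  }
}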

start build with -F option doesn't print logs

I'm using plugin version 0.9.6 with Jenkins 2.74.
$ oc version
oc v3.6.0+c4dd4cf
kubernetes v1.6.1+5115d708d7
features: Basic-Auth

Server xxxxxxxxxxxxx
openshift v3.6.173.0.5
kubernetes v1.6.1+5115d708d7

When I try to start a build from Jenkins using the -F flag with this command
def bc = openshift.selector("bc/jws-app")
bc.startBuild('--follow') or bc.startBuild('-F')

Jenkins waits until the build is finished, but it does not print the logs to the console. If I start the build from the command line
oc start-build bc/jws-app -F

then it works correctly and I can see the logs in the console
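
A possible workaround sketch while the --follow/-F output is not surfaced, assuming startBuild() returns a selector for the build it creates and that Selector.logs() is available:

def bc = openshift.selector('bc/jws-app')
def build = bc.startBuild()
// Stream the build logs into the Jenkins console instead of relying on -F/--follow.
build.logs('-f')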

Selector does not work with multiple labels

When I have 2 BuildConfigs in OpenShift like this:

BuildConfig 1 labels:

ref: master
app: my-app

BuildConfig 2 labels:

ref: master
app: my-second-app

I expect

def buildConfigSelector = openshift.selector('bc', [ref: 'master', app: 'my-third-app'])
if (buildConfigSelector.count() > 0) {
 sh 'echo more than 0'
}
else {
 sh 'echo no more than 0'
}

to print no more than 0. But currently it selects either BuildConfig 1 or BuildConfig 2. I can verify this using

def result = buildConfigSelector.describe()
sh "echo $result"

Describe prints BuildConfig 1 or BuildConfig 2.
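
For comparison, a sketch of the expected behavior: a map selector should be treated as a conjunction of labels, i.e. the equivalent of oc get bc -l ref=master,app=my-third-app, which matches nothing for the BuildConfigs above.

def sel = openshift.selector('bc', [ref: 'master', app: 'my-third-app'])
// Expected: 0, because no BuildConfig carries both labels.
echo "matched: ${sel.count()}"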

Using username/password credentials?

I could not get this to work. The only way I could get authentication to work was to use a token, as per the docs. I am not an expert on authentication in OpenShift, but don't I need to log in to get a token? It also expires every 24 hours, so it's not static. Does anyone have an example of using a username/password? Even better would be to update the docs with an example, or explain why username/password should not be used. I would add that I created the username/password credentials as the type 'Openshift Username Password'.

Document changes

Currently it is difficult to quickly see what has changed between releases. Having a changelog file or documenting changes in the README.md would remedy this without being a burden on development.

confirm proxy capability

@coreydaley - just forwarded an email thread from openshift-sme on this

Quick summary from that thread

  • oc should honor HTTP_PROXY or HTTPS_PROXY (I dove into this a few months back)
  • users should either set that env var via pipeline build strategy env vars, build params with that name, or use env.HTTP_PROXY=<value> in their scripts
  • (re)confirm in plugin java code that the EnvVar is properly propagated to the durable task plugin task that leads to the fork/exec of oc (OcAction.java)
  • at the moment, I think the maven/slave related stuff is orthogonal, but let's see what we get back from the customer

@bparees fyi

Command and argument disappeared when re-config jenkins project

I created a build step using the OpenShift Generic OC Invocation plugin in my Jenkins project and saved it without problems. But the command-line options and arguments fields DISAPPEARED when I came back to the configuration user interface to make some changes.

Kindly help!

Certificate authority is always expected

The plugin either expects the CA to be configured via "Cluster Configuration" (i.e. via the Manage Jenkins page), or it defaults to the mounted ca.crt under /var/run.

This is fine if you are using the plugin to talk to the same cluster you are running in, or if you want to explicitly set the CA. But in some instances customers will include the signer of the CA in the Jenkins slaves, so the certificate-authority flag is not needed.

For example, if my jenkins is running on cluster1, but trying to talk to cluster2, with the current plugin, it would use the following command:

oc get projects --server=https://cluster2.corp:8443 --namespace=dev --certificate-authority=/var/run/secrets/kubernetes.io/serviceaccount/ca.crt --token=XXXXX

This fails with an x509 certificate issue.

If I manually run the same command without the certificate-authority specified, it succeeds.

Is it possible to provide a flag, either by Cluster Config or Env, which can turn off the use of certificate-authority?

FYI: I am a RedHatter (part of UK Consulting @gahealy) on-site with a customer.

Conditional "when" directive is ignored

When using the "when" directive within a declarative pipeline, it gets ignored if the openshift client plugin is used.

In the following pipeline, both Build and Dev stages run. If the openshift.withCluster() block is removed, then only Dev runs.

openshift.withCluster() {
    pipeline {
        agent {
            label 'maven'
        }
        stages {
            stage('Build') {
                when {
                    expression {
                        return false
                    }
                }
                steps {
                    echo "Build"
                }
            }
            stage('Dev') {
                when {
                    expression {
                        return true
                    }
                }
                steps {
                    echo "Dev"
                }
            }
        }
    }
}

Tested with 1.0.2 and Jenkins 2.73.3 on OpenShift v3.7.0-rc.0+d076bb5-181.
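
A possible restructuring sketch that keeps the declarative pipeline at the top level and moves openshift.withCluster() inside script blocks (as in the "Generate ConfigMaps" example further down), so that declarative directives such as when are evaluated normally:

pipeline {
    agent { label 'maven' }
    stages {
        stage('Build') {
            when { expression { return false } }
            steps {
                script {
                    openshift.withCluster() {
                        echo "Build"
                    }
                }
            }
        }
        stage('Dev') {
            when { expression { return true } }
            steps {
                script {
                    openshift.withCluster() {
                        echo "Dev"
                    }
                }
            }
        }
    }
}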

Failed Stage Indicating Weirdly in Web UI

From this snippet of Jenkinsfile:

    stage("Generate ConfigMaps") {
      steps {
        script {
          openshift.withCluster() {
            // Create the build and run it
            def builderObjects = openshift.process("--filename=jenkins/setup/resources/config-generator-build.yaml")
            openshift.apply(builderObjects)
            def jobBuilder = openshift.selector("buildconfig/configmap-job-builder")
            sh "cp -R jenkins/library/src jenkins/setup/images/config-job-image"
            openshift.startBuild("configmap-job", "--from-dir=./jenkins/setup/images/config-job-image", "--follow")

            // Wait for an imagestream tag for the builder to be available
            def imageStream = openshift.selector("imagestream/configmap-job")
            def jobImageRef = ""
            imageStream.watch {
              is = it.object()
              if ((is.status.tags != null) && (is.status.tags.size() > 0)) {
                jobImageRef = is.status.tags[0].items[0].dockerImageReference
                return true
              }
            }

            openshift.selector("job/configmap-job").delete("--ignore-not-found")

            // Run the job to create config maps
            def configMapJob = openshift.process(
              "--filename=jenkins/setup/resources/config-generator-job.yaml",
              "-p", "JOB_IMAGE=${jobImageRef}",
              "-p", "JENKINS_ADMIN_SA=${env.JENKINS_ADMIN_SA}")
            openshift.apply(configMapJob)
          }
        }
      }
  }

I see in the logs:

apply returned an error;
{
  "reference": {
    "/tmp/apply1725148177158694347.markup": {
      "apiVersion": "v1",
      "kind": "List",
      "items": [
        {
          "metadata": {
            "name": "configmap-job",
            "labels": {
              "template": "configmap-job",
              "job-name": "configmap-job"
            }
          },
          "apiVersion": "batch/v1",
          "kind": "Job",
          "spec": {
            "template": {
              "metadata": {
                "labels": {
                  "job-name": "configmap-job"
                }
              },
              "spec": {
                "dnsPolicy": "ClusterFirst",
                "containers": [
                  {
                    "name": "configmap-setup",
                    "image": "172.30.1.1:5000/jenkins/configmap-job@sha256:ec005904f8f950d128deb6216ae0dd0a2a79bad8d0bcf93164285394787d6930",
                    "imagePullPolicy": "IfNotPresent",
                    "resources": {}
                  }
                ],
                "securityContext": {},
                "restartPolicy": "Never",
                "serviceAccountName": "jenkins-admin"
              }
            },
            "completions": 1,
            "activeDeadlineSeconds": 600,
            "parallelism": 1
          }
        }
      ]
    }
  },
  "err": "Error from server (NotFound): the server could not find the requested resource",
  "verb": "apply",
  "cmd": "oc apply -f /tmp/apply1725148177158694347.markup -o=name --server=https://172.30.0.1:443 --namespace=jenkins --token=XXXXX",
  "out": "",
  "status": "1"
}

But the web UI does not show a failed step:

[screenshot: jenkins_bad]

openshift.create() creates file on master in master/slave setup resulting in error

In a master/slave scenario, when using openshift.create() within a node('foo'){} block, a temporary file named /tmp/create<randomness>.markup is created on the master, but the oc create -f <tempfile> is executed on the slave, resulting in an error: file not found.

Sample syntax that causes an issue:

openshift.withCluster() {
  openshift.withProject() {
    node('nodejs') {
      stage('create template') {
            git url: 'https://github.com/openshift/nodejs-ex.git' 
            def template = openshift.create(readFile('openshift/templates/nodejs-mongodb.json')).object()
      }
    }
  }
}

Sample error:
err=error: the path "/tmp/create739820317386924410.markup" does not exist, verb=create, cmd=oc create -f /tmp/create739820317386924410.markup -o=name --server=https://172.30.0.1:443 --namespace=testing --token=XXXXX , out=, status=1}

Potential race condition when using multibranch and newBuild

When using the workflow multibranch plugin with newBuild, a potential race can occur. If more than one build starts simultaneously, the Dockerfile's base image could be added as an ImageStream twice, which results in an error since the ImageStream already exists.

Example of this issue using oc new-build from the CLI:

jcallen@jcallen ~                                                                                                                 [10:47:12] 
> $ oc new-build -D $'FROM centos\nRUN yum install -y vim wget curl zsh' --name master; oc new-build -D $'FROM centos\nRUN yum install -y vim wget curl zsh' --name foo
--> Found Docker image a8493f5 (2 weeks old) from Docker Hub for "centos"

    * An image stream will be created as "centos:latest" that will track the source image
    * A Docker build using a predefined Dockerfile will be created
      * The resulting image will be pushed to image stream "master:latest"
      * Every time "centos:latest" changes a new build will be triggered

--> Creating resources with label build=master ...
    imagestream "centos" created
    imagestream "master" created
    buildconfig "master" created
--> Success
    Build configuration "master" created and build triggered.
    Run 'oc logs -f bc/master' to stream the build progress.
--> Found Docker image a8493f5 (2 weeks old) from Docker Hub for "centos"

    * An image stream will be created as "centos:latest" that will track the source image
    * A Docker build using a predefined Dockerfile will be created
      * The resulting image will be pushed to image stream "foo:latest"
      * Every time "centos:latest" changes a new build will be triggered

--> Creating resources with label build=foo ...
    error: imagestreams "centos" already exists
    imagestream "foo" created
    buildconfig "foo" created
--> Failed

One possible resolution of this issue that I have tested is to use Jenkins' Lockable Resources plugin, for example:
https://github.com/jcpowermac/ojalloy/blob/master/vars/newBuildOpenShift.groovy#L25
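
A minimal sketch of that approach, assuming the Lockable Resources plugin is installed so the lock step is available; the resource name is arbitrary and the argument handling mirrors the CLI example above:

// Serialize the new-build calls so the shared 'centos' ImageStream is only created once.
lock(resource: 'imagestream-centos') {
    openshift.withCluster() {
        openshift.withProject() {
            openshift.newBuild("--name=${env.BRANCH_NAME}",
                               '-D', 'FROM centos\nRUN yum install -y vim wget curl zsh')
        }
    }
}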

untilEach: does not work with a static selector?

Here is my Jenkinsfile:

pipeline {
    agent any
    stages {
        stage ('test') {
            steps {
                script {
                    List<String> names = Arrays.asList("pods/release-ci-binary-1-build", "pods/release-ci-binary-2-build", "pods/release-ci-binary-3-build", "pods/release-ci-binary-4-build", "pods/release-ci-binary-6-build")
                    openshift.withCluster() {
                        def selector = openshift.selector(names[0])
                        if (names.size() > 1) {
                            for (String name : names[1..-1]) {
                                selector = selector.union(openshift.selector(name))
                            }
                        }
                        selector.untilEach(1) {
                            echo "Saw ${it.name()}"
                        }
                    }
                }
            }
        }
    }
}

Here is the output (+100000000000 points for the error message):

Started by user stevekuznetsov
[Pipeline] node
Running on master in /var/lib/jenkins/jobs/testing/workspace
[Pipeline] {
[Pipeline] stage
[Pipeline] { (test)
[Pipeline] script
[Pipeline] {
[Pipeline] echo

[Pipeline] _OcContextInit
[Pipeline] readFile
[Pipeline] readFile
[Pipeline] _OcWatch
Entering watch
Running watch closure body
[Pipeline] {
[Pipeline] readFile
[Pipeline] readFile
[Pipeline] _OcAction
[Pipeline] readFile
[Pipeline] readFile
[Pipeline] _OcAction
[Pipeline] echo
Saw pods/release-ci-binary-3-build
[Pipeline] echo
Saw pods/release-ci-binary-6-build
[Pipeline] echo
Saw pods/release-ci-binary-2-build
[Pipeline] echo
Saw pods/release-ci-binary-4-build
[Pipeline] echo
Saw pods/release-ci-binary-1-build
[Pipeline] }
[Pipeline] // _OcWatch
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: watch terminated with an error: 1
Finished: FAILURE

adjust to underlying oc/openshift change in types returned on selectors

When running jenkins-image-sample.groovy, we used to be able to run it multiple times within a jenkins instance.

However, there seems to have been a change in the underlying oc/openshift behavior. Based on a quick grep of "deploymentconfigs" in the origin code base, I suspect (though have not precisely confirmed) that it could be related to the recent API group changes.

When first creating the objs in jenkins-image-sample.groovy from the mongodb-ephemeral template, when iterating over the created objects, DCs are prefaced with the type "deploymentconfig/". And the current jenkins-client-plugin code base works fine.

However, if you run the pipeline again, and we find existing objects (and hence don't create them), when we iterate over the objects from the selector, DCs are prefaced with "deploymentconfigs/" (note the 's' at the end on deploymentconfig), and the abbreviation mapping logic in OpenShiftDSL.groovy does not recognize the type and aborts.

Adjusting the abbreviation mapping from a 1-1 to a 1-many should be straightforward (though it will again tax my already tenuous feelings toward groovy, I'm sure).

@bparees fyi
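
A rough sketch of the kind of 1-to-many adjustment described above (not the plugin's actual code): treat singular and plural kind prefixes as aliases that resolve to one canonical abbreviation before lookup.

// Hypothetical alias table: both spellings resolve to the same canonical key.
def kindAliases = [
    'deploymentconfig' : 'dc', 'deploymentconfigs': 'dc',
    'buildconfig'      : 'bc', 'buildconfigs'     : 'bc',
    'imagestream'      : 'is', 'imagestreams'     : 'is',
]

def canonicalKind = { String name ->
    // 'deploymentconfigs/mongodb' -> 'deploymentconfigs' -> 'dc'
    def kind = name.toLowerCase().tokenize('/')[0]
    return kindAliases[kind] ?: kind
}

assert canonicalKind('deploymentconfigs/mongodb') == 'dc'
assert canonicalKind('deploymentconfig/mongodb')  == 'dc'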

Checking if resource exists

I am having a problem checking whether a resource exists before I delete it.
It would be good if you could implement a method for tasks like this.

def p = openshift.selector( 'hpa/myautoscaler' ).object()
if(p != null){openshift.selector( 'hpa/myautoscaler' ).delete()}

Right now, when I try to execute the first command to get the resource object, it ends up failing my Jenkins pipeline build.

ERROR: Error during delete;
{reference={}, err=Error from server (NotFound): horizontalpodautoscalers.autoscaling "myautoscaler" not found, verb=delete, cmd=oc delete hpa/myautoscaler --insecure-skip-tls-verify --server=https://my.awsome.cluster --namespace=test --token=XXXXX , out=, status=1}
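
In the meantime, a possible workaround sketch that avoids checking for existence at all, using the --ignore-not-found flag that oc delete supports (the same pattern appears in the "Failed Stage" issue above):

// Delete the resource if it exists; do nothing (and do not fail) if it does not.
openshift.selector( 'hpa/myautoscaler' ).delete( '--ignore-not-found' )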
