dekorateio / dekorate
Tools for generating Kubernetes related manifests.
License: Apache License 2.0
When we generate the resource for a Component, the apiVersion included is apiVersion: "v1beta1" and not the apiVersion of the CRD, which is component.k8s.io/v1alpha1:
---
apiVersion: "v1"
kind: "List"
items:
- apiVersion: "v1beta1"
  kind: "Component"
  metadata:
    annotations: {}
    labels: {}
    name: ""
  spec:
    deploymentMode: "innerloop"
    runtime: "spring-boot"
    exposeService: false
    image: []
    env:
    - name: "key1"
      value: "val1"
    - name: "key1"
      value: "val1"
    feature: []
    link: []
    service: []
There have been talks about the annotation names and how descriptive they are.
For example:
don't quite describe how they are used. They neither describe nor provide the necessary context.
This issue is about discussing naming styles and alternatives:
It might not describe the intention of generating resources, but then again I don't feel that it has to. sundrio, immutables, lombok and other projects that generate code or perform bytecode manipulation don't use names that reflect that. Instead they use names that try to describe the annotated class in some way.
In the same spirit we could follow the spring boot theme and also go with:
For the service catalog I can't find an equivalent. But it could go like:
which is TOO long IMHO.
or a variation that doesn't have coupling with openshift, servicecatalog etc:
@Generate({@KubernetesResources(...), @ServiceCatalogResource()})
It's not as simple as I would want.
The hooks are fed the resource path incorrectly.
This renders the OcBuildHook completely useless atm.
When we test this class: https://github.com/ap4k/ap4k/blob/issue-20/examples/component-example/src/test/java/io/ap4k/examples/ComponentSpringBootExampleTest.java
we get this error:
io.ap4k.Ap4kException: No resource type found for:v1beta1#Component
at [Source: N/A; line: -1, column: -1] (through reference chain: io.ap4k.deps.kubernetes.api.model.KubernetesList["items"]->java.util.ArrayList[0])
at io.ap4k.Ap4kException.launderThrowable(Ap4kException.java:26)
at io.ap4k.Ap4kException.launderThrowable(Ap4kException.java:16)
at io.ap4k.utils.Serialization.unmarshal(Serialization.java:97)
at io.ap4k.utils.Serialization.unmarshal(Serialization.java:72)
at io.ap4k.utils.Serialization.unmarshal(Serialization.java:61)
at io.ap4k.examples.ComponentSpringBootExampleTest.shouldContainComponent(ComponentSpringBootExampleTest.java:14)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:436)
at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:115)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$6(TestMethodTestDescriptor.java:170)
at org.junit.jupiter.engine.execution.ThrowableCollector.execute(ThrowableCollector.java:40)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:166)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:113)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:58)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor$NodeExecutor.lambda$executeRecursively$3(HierarchicalTestExecutor.java:112)
at org.junit.platform.engine.support.hierarchical.SingleTestExecutor.executeSafely(SingleTestExecutor.java:66)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor$NodeExecutor.executeRecursively(HierarchicalTestExecutor.java:108)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor$NodeExecutor.execute(HierarchicalTestExecutor.java:79)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor$NodeExecutor.lambda$executeRecursively$2(HierarchicalTestExecutor.java:120)
at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
at java.util.Iterator.forEachRemaining(Iterator.java:116)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor$NodeExecutor.lambda$executeRecursively$3(HierarchicalTestExecutor.java:120)
at org.junit.platform.engine.support.hierarchical.SingleTestExecutor.executeSafely(SingleTestExecutor.java:66)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor$NodeExecutor.executeRecursively(HierarchicalTestExecutor.java:108)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor$NodeExecutor.execute(HierarchicalTestExecutor.java:79)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor$NodeExecutor.lambda$executeRecursively$2(HierarchicalTestExecutor.java:120)
at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
at java.util.Iterator.forEachRemaining(Iterator.java:116)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor$NodeExecutor.lambda$executeRecursively$3(HierarchicalTestExecutor.java:120)
at org.junit.platform.engine.support.hierarchical.SingleTestExecutor.executeSafely(SingleTestExecutor.java:66)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor$NodeExecutor.executeRecursively(HierarchicalTestExecutor.java:108)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor$NodeExecutor.execute(HierarchicalTestExecutor.java:79)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:55)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:43)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:170)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:154)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:90)
at com.intellij.junit5.JUnit5IdeaTestRunner.startRunnerWithArgs(JUnit5IdeaTestRunner.java:74)
at com.intellij.rt.execution.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:47)
at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:242)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:70)
Caused by: io.ap4k.deps.jackson.databind.JsonMappingException: No resource type found for:v1beta1#Component
at [Source: N/A; line: -1, column: -1] (through reference chain: io.ap4k.deps.kubernetes.api.model.KubernetesList["items"]->java.util.ArrayList[0])
at io.ap4k.deps.jackson.databind.JsonMappingException.from(JsonMappingException.java:255)
at io.ap4k.deps.jackson.databind.DeserializationContext.mappingException(DeserializationContext.java:982)
at io.ap4k.deps.kubernetes.internal.KubernetesDeserializer.deserialize(KubernetesDeserializer.java:78)
at io.ap4k.deps.kubernetes.internal.KubernetesDeserializer.deserialize(KubernetesDeserializer.java:32)
at io.ap4k.deps.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:277)
at io.ap4k.deps.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:249)
at io.ap4k.deps.jackson.databind.deser.std.CollectionDeserializer.deserialize(CollectionDeserializer.java:26)
at io.ap4k.deps.jackson.databind.deser.SettableBeanProperty.deserialize(SettableBeanProperty.java:490)
at io.ap4k.deps.jackson.databind.deser.impl.MethodProperty.deserializeAndSet(MethodProperty.java:95)
at io.ap4k.deps.jackson.databind.deser.BeanDeserializer.vanillaDeserialize(BeanDeserializer.java:260)
at io.ap4k.deps.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:125)
at io.ap4k.deps.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:3779)
at io.ap4k.deps.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2050)
at io.ap4k.deps.jackson.databind.ObjectMapper.treeToValue(ObjectMapper.java:2547)
at io.ap4k.deps.kubernetes.internal.KubernetesDeserializer.deserialize(KubernetesDeserializer.java:80)
at io.ap4k.deps.kubernetes.internal.KubernetesDeserializer.deserialize(KubernetesDeserializer.java:32)
at io.ap4k.deps.jackson.databind.ObjectReader._bindAndClose(ObjectReader.java:1578)
at io.ap4k.deps.jackson.databind.ObjectReader.readValue(ObjectReader.java:1166)
at io.ap4k.utils.Serialization.unmarshal(Serialization.java:95)
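The failure is easier to see with a simplified, stdlib-only model of how the shaded fabric8 deserializer resolves kinds: it keys resource types by "apiVersion#kind", so a Component emitted with the bogus apiVersion "v1beta1" can never match a type registered under the CRD's apiVersion. (Class names below are illustrative, not ap4k's actual implementation.)

```java
import java.util.HashMap;
import java.util.Map;

// Simplified model of the deserializer's kind resolution:
// resource classes are keyed by "<apiVersion>#<kind>".
public class KindRegistry {
    private final Map<String, Class<?>> mapping = new HashMap<>();

    public void register(String apiVersion, String kind, Class<?> type) {
        mapping.put(apiVersion + "#" + kind, type);
    }

    public Class<?> resolve(String apiVersion, String kind) {
        Class<?> type = mapping.get(apiVersion + "#" + kind);
        if (type == null) {
            throw new IllegalArgumentException("No resource type found for:" + apiVersion + "#" + kind);
        }
        return type;
    }

    public static void main(String[] args) {
        KindRegistry registry = new KindRegistry();
        // The Component kind would only be registered under its CRD apiVersion...
        registry.register("component.k8s.io/v1alpha1", "Component", Object.class);
        // ...so looking it up with the wrong "v1beta1" apiVersion fails,
        // which matches the error reported by the test above.
        try {
            registry.resolve("v1beta1", "Component");
        } catch (IllegalArgumentException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

So fixing the emitted apiVersion (or registering the kind under "v1beta1" as well) would make the lookup succeed.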
Currently, the openshift annotation processor is triggered by both @KubernetesApplication and @OpenshiftApplication annotations.
If @KubernetesApplication is used without @OpenshiftApplication and openshift-annotations is on the classpath, then compilation will fail with an NPE.
The following error is generated when we compile a maven module whose pom has no parent:
Caused by: java.lang.IllegalStateException: Could not read child element: parent
at io.ap4k.project.MavenInfo.lambda$getElement$6 (MavenInfo.java:145)
at java.util.Optional.orElseThrow (Optional.java:290)
at io.ap4k.project.MavenInfo.getElement (MavenInfo.java:145)
at io.ap4k.project.MavenInfo.getParentVersion (MavenInfo.java:109)
at io.ap4k.project.MavenInfo.getVersion (MavenInfo.java:89)
at io.ap4k.project.MavenInfo.<init> (MavenInfo.java:45)
at io.ap4k.project.MavenInfoReader.getInfo (MavenInfoReader.java:45)
at io.ap4k.project.MavenInfoReader.getInfo (MavenInfoReader.java:29)
at io.ap4k.project.FileProjectFactory.lambda$getProjectInfo$2 (FileProjectFactory.java:78)
at java.util.Optional.map (Optional.java:215)
at io.ap4k.project.FileProjectFactory.getProjectInfo (FileProjectFactory.java:78)
at io.ap4k.project.FileProjectFactory.createInternal (FileProjectFactory.java:59)
at io.ap4k.project.FileProjectFactory.create (FileProjectFactory.java:48)
at io.ap4k.project.AptProjectFactory.createInternal (AptProjectFactory.java:52)
at io.ap4k.project.AptProjectFactory.create (AptProjectFactory.java:42)
at io.ap4k.processor.AbstractAnnotationProcessor.init (AbstractAnnotationProcessor.java:54)
This error is reported if I change the version of the ap4k deps to 1.0-SNAPSHOT and execute mvn clean install:
[INFO] --- maven-install-plugin:2.4:install (default-install) @ ap4k-micronaut ---
[INFO] Installing /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/frameworks/micronaut/target/ap4k-micronaut-1.0-SNAPSHOT.jar to /Users/dabou/.m2/repository/io/ap4k/ap4k-micronaut/1.0-SNAPSHOT/ap4k-micronaut-1.0-SNAPSHOT.jar
[INFO] Installing /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/frameworks/micronaut/pom.xml to /Users/dabou/.m2/repository/io/ap4k/ap4k-micronaut/1.0-SNAPSHOT/ap4k-micronaut-1.0-SNAPSHOT.pom
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] AP4K 1.0-SNAPSHOT .................................. SUCCESS [ 0.306 s]
[INFO] AP4K :: Core ....................................... SUCCESS [ 13.281 s]
[INFO] AP4K :: Annotations ................................ SUCCESS [ 0.011 s]
[INFO] AP4K :: Annotations :: Kubernetes .................. SUCCESS [ 3.253 s]
[INFO] AP4K :: Annotations :: Openshift ................... SUCCESS [ 7.634 s]
[INFO] AP4K :: Annotations :: Istio ....................... SUCCESS [ 4.190 s]
[INFO] AP4K :: Annotations :: Service Catalog ............. SUCCESS [ 4.754 s]
[INFO] AP4K :: Annotations :: Component Operator .......... SUCCESS [ 6.043 s]
[INFO] AP4K :: Frameworks ................................. SUCCESS [ 0.013 s]
[INFO] AP4K :: Frameworks :: Spring Boot .................. SUCCESS [ 1.701 s]
[INFO] AP4K :: Examples ................................... SUCCESS [ 0.008 s]
[INFO] AP4K :: Examples :: Service Catalog ................ SUCCESS [ 1.211 s]
[INFO] AP4K :: Examples :: Kubernetes ..................... SUCCESS [ 2.825 s]
[INFO] AP4K :: Examples :: Openshift ...................... SUCCESS [ 3.932 s]
[INFO] AP4K :: Examples :: Istio .......................... SUCCESS [ 1.298 s]
[INFO] AP4K :: Examples :: Spring Boot on Kubernetes ...... SUCCESS [ 2.922 s]
[INFO] AP4K :: Examples :: Spring Boot on Openshift ....... SUCCESS [ 3.164 s]
[INFO] AP4K :: Examples :: Component Operator ............. SUCCESS [ 0.856 s]
[INFO] AP4K :: Testing .................................... SUCCESS [ 0.007 s]
[INFO] AP4K :: Testing :: Openshift ....................... SUCCESS [ 0.890 s]
[INFO] AP4K :: Examples :: Source to Image Example ........ SUCCESS [ 54.237 s]
[INFO] AP4K :: Frameworks :: Thorntail .................... SUCCESS [ 0.557 s]
[INFO] AP4K :: Frameworks :: Micronaut 1.0-SNAPSHOT ....... SUCCESS [ 0.489 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:53 min
[INFO] Finished at: 2018-11-21T18:18:12+01:00
[INFO] ------------------------------------------------------------------------
Applied: s2i-java
Applied: source-to-image-example
Applied: source-to-image-example
Applied: source-to-image-example
Exception in thread "Thread-35" java.lang.NoClassDefFoundError: Lio/ap4k/deps/openshift/api/model/BuildSpec;
at java.lang.Class.getDeclaredFields0(Native Method)
at java.lang.Class.privateGetDeclaredFields(Class.java:2583)
at java.lang.Class.getDeclaredFields(Class.java:1916)
at io.ap4k.deps.jackson.databind.util.ClassUtil$ClassMetadata.getDeclaredFields(ClassUtil.java:1087)
at io.ap4k.deps.jackson.databind.util.ClassUtil.getDeclaredFields(ClassUtil.java:386)
at io.ap4k.deps.jackson.databind.introspect.AnnotatedClass._findFields(AnnotatedClass.java:809)
at io.ap4k.deps.jackson.databind.introspect.AnnotatedClass.resolveFields(AnnotatedClass.java:575)
at io.ap4k.deps.jackson.databind.introspect.AnnotatedClass.fields(AnnotatedClass.java:353)
at io.ap4k.deps.jackson.databind.introspect.POJOPropertiesCollector._addFields(POJOPropertiesCollector.java:350)
at io.ap4k.deps.jackson.databind.introspect.POJOPropertiesCollector.collectAll(POJOPropertiesCollector.java:283)
at io.ap4k.deps.jackson.databind.introspect.POJOPropertiesCollector.getPropertyMap(POJOPropertiesCollector.java:248)
at io.ap4k.deps.jackson.databind.introspect.POJOPropertiesCollector.getProperties(POJOPropertiesCollector.java:155)
at io.ap4k.deps.jackson.databind.introspect.BasicBeanDescription._properties(BasicBeanDescription.java:142)
at io.ap4k.deps.jackson.databind.introspect.BasicBeanDescription.findProperties(BasicBeanDescription.java:217)
at io.ap4k.deps.jackson.databind.deser.BasicDeserializerFactory._findCreatorsFromProperties(BasicDeserializerFactory.java:330)
at io.ap4k.deps.jackson.databind.deser.BasicDeserializerFactory._constructDefaultValueInstantiator(BasicDeserializerFactory.java:312)
at io.ap4k.deps.jackson.databind.deser.BasicDeserializerFactory.findValueInstantiator(BasicDeserializerFactory.java:252)
at io.ap4k.deps.jackson.databind.deser.BeanDeserializerFactory.buildBeanDeserializer(BeanDeserializerFactory.java:221)
at io.ap4k.deps.jackson.databind.deser.BeanDeserializerFactory.createBeanDeserializer(BeanDeserializerFactory.java:143)
at io.ap4k.deps.jackson.databind.deser.DeserializerCache._createDeserializer2(DeserializerCache.java:406)
at io.ap4k.deps.jackson.databind.deser.DeserializerCache._createDeserializer(DeserializerCache.java:352)
at io.ap4k.deps.jackson.databind.deser.DeserializerCache._createAndCache2(DeserializerCache.java:264)
at io.ap4k.deps.jackson.databind.deser.DeserializerCache._createAndCacheValueDeserializer(DeserializerCache.java:244)
at io.ap4k.deps.jackson.databind.deser.DeserializerCache.findValueDeserializer(DeserializerCache.java:142)
at io.ap4k.deps.jackson.databind.DeserializationContext.findRootValueDeserializer(DeserializationContext.java:477)
at io.ap4k.deps.jackson.databind.ObjectMapper._findRootDeserializer(ObjectMapper.java:3908)
at io.ap4k.deps.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:3803)
at io.ap4k.deps.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2874)
at io.ap4k.deps.kubernetes.client.utils.Serialization.unmarshal(Serialization.java:235)
at io.ap4k.deps.kubernetes.client.utils.Serialization.unmarshal(Serialization.java:190)
at io.ap4k.deps.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:383)
at io.ap4k.deps.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:360)
at io.ap4k.deps.openshift.client.dsl.internal.BuildConfigOperationsImpl.fromInputStream(BuildConfigOperationsImpl.java:274)
at io.ap4k.deps.openshift.client.dsl.internal.BuildConfigOperationsImpl.fromFile(BuildConfigOperationsImpl.java:231)
at io.ap4k.deps.openshift.client.dsl.internal.BuildConfigOperationsImpl.fromFile(BuildConfigOperationsImpl.java:68)
at io.ap4k.openshift.hook.JavaBuildHook.run(JavaBuildHook.java:70)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.ClassNotFoundException: io.ap4k.deps.openshift.api.model.BuildSpec
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 37 more
Run mvn compile on the component-example project:
package io.ap4k.examples.component;

import io.ap4k.annotation.Env;
import io.ap4k.annotation.KubernetesApplication;
import io.ap4k.component.annotation.CompositeApplication;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@KubernetesApplication
@CompositeApplication(envVars = @Env(name = "key1", value = "val1"))
@SpringBootApplication
public class Main {

  public static void main(String[] args) {
    SpringApplication.run(Main.class, args);
  }
}
As you can see, the generated file contains the envVar, but the runtime field is missing:
---
apiVersion: "v1"
kind: "List"
items:
- apiVersion: "v1beta1"
  kind: "Component"
  metadata:
    annotations: {}
    labels: {}
    name: "component-example"
  spec:
    deploymentMode: "innerloop"
    exposeService: false
    image: []
    env:
    - name: "key1"
      value: "val1"
    - name: "key1"
      value: "val1"
    feature: []
    link: []
    service: []
Currently, in order to use prometheus with spring boot, the user has to manually set:
management.endpoints.enabled-by-default=true
management.endpoints.web.exposure.include=health,info,metrics,prometheus
IMHO, adding @EnableServiceMonitor on top of the main class is good enough to express the intention to expose the prometheus endpoint, and thus the manual configuration should be made optional.
So, what ways do we have to pass this configuration to the application?
Override properties and mount them to the application pod. While option 3 seems to be the simplest, I want to keep the project as decoupled as possible from apt.
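One way the "mount properties into the pod" option could look is for the generator to emit the properties as a ConfigMap mounted into the application container. A minimal sketch (the resource name is hypothetical, not something ap4k generates today):

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: app-monitoring-config   # hypothetical name
data:
  application.properties: |
    management.endpoints.enabled-by-default=true
    management.endpoints.web.exposure.include=health,info,metrics,prometheus
```

The deployment would then mount this ConfigMap at a location Spring Boot picks up (e.g. via spring.config.additional-location), keeping the user's own application.properties untouched.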
Is @CompositeApplication the most appropriate wording to use? To be honest, I'm not sure.
We can keep it for the moment, but the information it includes is next used by the controller/operator deployed on openshift to generate the following resources, which are mandatory for every microservice deployment: DeploymentConfig, Service, Route (optional).
A better name might be @ComponentAnnotation or even better @MicroserviceAnnotation.
WDYT @iocanel ?
<!-- To generate CRD -->
<dependency>
  <groupId>io.ap4k</groupId>
  <artifactId>ap4k-core</artifactId>
  <version>1.0-SNAPSHOT</version>
</dependency>
<dependency>
  <groupId>io.ap4k</groupId>
  <artifactId>component-annotations</artifactId>
  <version>1.0-SNAPSHOT</version>
</dependency>
<dependency>
  <groupId>io.ap4k</groupId>
  <artifactId>servicecatalog-annotations</artifactId>
  <version>1.0-SNAPSHOT</version>
</dependency>
<dependency>
  <groupId>io.ap4k</groupId>
  <artifactId>ap4k-spring-boot</artifactId>
  <version>1.0-SNAPSHOT</version>
</dependency>
The following java class, annotated with @ServiceCatalog, @ServiceCatalogInstance and @Parameter, doesn't include the parameters in the generated yml file:
package com.example.demo;

import io.ap4k.component.annotation.CompositeApplication;
import io.ap4k.kubernetes.annotation.Env;
import io.ap4k.servicecatalog.annotation.Parameter;
import io.ap4k.servicecatalog.annotation.ServiceCatalog;
import io.ap4k.servicecatalog.annotation.ServiceCatalogInstance;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@CompositeApplication(
    exposeService = true,
    envVars = @Env(name = "SPRING_PROFILES_ACTIVE", value = "openshift-catalog"))
@ServiceCatalog(
    instances = @ServiceCatalogInstance(
        name = "postgresql-db",
        serviceClass = "dh-postgresql-apb",
        servicePlan = "dev",
        bindingSecret = "postgresql-db",
        parameters = {
            @Parameter(key = "postgresql_user", value = "luke"),
            @Parameter(key = "postgresql_password", value = "secret"),
            @Parameter(key = "postgresql_database", value = "my_data"),
            @Parameter(key = "postgresql_version", value = "9.6")
        }
    )
)
---
apiVersion: "v1"
kind: "List"
items:
- apiVersion: "component.k8s.io/v1alpha1"
  kind: "Component"
  metadata:
    annotations: {}
    labels: {}
    name: ""
  spec:
    deploymentMode: "innerloop"
    runtime: "spring-boot"
    exposeService: true
    image: []
    env:
    - name: "SPRING_PROFILES_ACTIVE"
      value: "openshift-catalog"
    - name: "SPRING_PROFILES_ACTIVE"
      value: "openshift-catalog"
    feature: []
    link: []
    service:
    - name: "postgresql-db"
      class: "dh-postgresql-apb"
      plan: "dev"
      secretName: "postgresql-db"
      parameters: []
Currently, an array of Link, Service or Env is generated using the singular form as the yaml/json key:
env:
- name: "SPRING_PROFILES_ACTIVE"
  value: "openshift-catalog"
link:
- kind: "Secret"
  name: "Secret to be injected as EnvVar using Service's secret"
  targetComponentName: "fruit-backend-sb"
  ref: "postgresql-db"
service:
- name: "postgresql-db"
  class: "dh-postgresql-apb"
  plan: "dev"
  secretName: "postgresql-db"
  parameters:
Since this is the case for most of the k8s arrays (and it is also supported by the component operator), I propose that we use the plural forms links, services and envs to name these arrays.
See the Snowdrop Security K8s example: https://github.com/snowdrop/spring-boot-http-secured-booster/blob/master/.openshiftio/service.sso.yaml
The generated component should then be:
---
apiVersion: "v1"
kind: "List"
items:
- apiVersion: "component.k8s.io/v1alpha1"
  kind: "Component"
  metadata:
    name: "fruit-backend-sb"
  spec:
    deploymentMode: "innerloop"
    runtime: "spring-boot"
    exposeService: true
    envs:
    - name: "SPRING_PROFILES_ACTIVE"
      value: "openshift-catalog"
    links:
    - kind: "Secret"
      name: "Secret to be injected as EnvVar using Service's secret"
      targetComponentName: "fruit-backend-sb"
      ref: "postgresql-db"
    services:
    - name: "postgresql-db"
      class: "dh-postgresql-apb"
      plan: "dev"
      secretName: "postgresql-db"
      parameters:
      - name: "postgresql_user"
        value: "luke"
      - name: "postgresql_password"
        value: "secret"
      - name: "postgresql_database"
        value: "my_data"
      - name: "postgresql_version"
        value: "9.6"
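If the Java model keeps its current singular field names, the plural serialized keys could be produced with Jackson property renames. A sketch of the relevant ComponentSpec fields, assuming the shaded Jackson annotations the class already uses (this is a fragment, not a complete class):

```java
import io.ap4k.deps.jackson.annotation.JsonProperty;

// Inside ComponentSpec: fields keep their singular Java names,
// but serialize under the proposed plural keys.
@JsonProperty("envs")
private List<Env> env;

@JsonProperty("links")
private List<Link> link;

@JsonProperty("services")
private List<Service> service;
```

This keeps the builder/getter API stable while changing only the yaml/json output.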
We need to make sure that when passing build parameters like -Dap4k.build=true and -Dap4k.deploy=true, things don't break.
See #112
If @EnableDockerBuild or @EnableS2iBuild is not present, the tests need to be skipped, just as they are when a cluster is not present.
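One way the skip could be wired is through JUnit 5's ExecutionCondition extension point. This is only a sketch: it assumes the @EnableDockerBuild/@EnableS2iBuild annotations are retained at runtime (if they are source-retained, the condition would need to inspect the generated manifests instead), and the ap4k imports are omitted:

```java
import org.junit.jupiter.api.extension.ConditionEvaluationResult;
import org.junit.jupiter.api.extension.ExecutionCondition;
import org.junit.jupiter.api.extension.ExtensionContext;

// Hypothetical condition: enable the test only when a build annotation is present.
public class RequiresBuildAnnotation implements ExecutionCondition {

    @Override
    public ConditionEvaluationResult evaluateExecutionCondition(ExtensionContext context) {
        boolean present = context.getTestClass()
            .map(c -> c.isAnnotationPresent(EnableDockerBuild.class)
                   || c.isAnnotationPresent(EnableS2iBuild.class))
            .orElse(false);
        return present
            ? ConditionEvaluationResult.enabled("build annotation present")
            : ConditionEvaluationResult.disabled("no @EnableDockerBuild/@EnableS2iBuild, skipping");
    }
}
```

This mirrors how the existing "no cluster present" skip behaves, so both skip reasons show up in the test report.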
Currently, we rely heavily on visitors to perform any sort of config and model updates.
This is powerful, but if one is not careful it is possible to register a visitor twice, leading to duplicate entries in the generated resources.
We need to prevent duplicates, but also have a better way to track which visitor is registered from where.
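Both goals could be met with a registry that rejects duplicates and remembers who registered each visitor. A minimal stdlib-only sketch (all names hypothetical, not ap4k's actual API):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical registry: one entry per visitor identity, tracking its registrar.
public class VisitorRegistry {
    // visitor identity -> the module/processor that registered it
    private final Map<String, String> owners = new LinkedHashMap<>();

    public boolean register(String visitorId, String registeredBy) {
        String previous = owners.putIfAbsent(visitorId, registeredBy);
        if (previous != null) {
            System.out.println(visitorId + " already registered by " + previous + "; skipping");
            return false;
        }
        return true;
    }

    public static void main(String[] args) {
        VisitorRegistry registry = new VisitorRegistry();
        System.out.println(registry.register("AddEnvVisitor", "kubernetes-annotations"));
        // Registering the same visitor again is rejected, and the origin is reported:
        System.out.println(registry.register("AddEnvVisitor", "openshift-annotations"));
    }
}
```

Keying by a stable visitor identity (rather than object reference) is what prevents two processors from silently adding the same visitor twice.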
Add a new Link annotation to be used by the Composite annotation:

public @interface Link {
  String name();
  String targetComponentName();
  String kind();
  Env[] envVars() default {};
}
Currently, ap4k doesn't always check which is the target deployment or the target container. So there might be cases where an option is eagerly applied to more deployments/containers than desired (this is possible when editing external resources, or when using sidecars or init containers).
We need to make sure that each decorator is only applied where it needs to be.
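A sketch of the idea, with hypothetical types: the decorator carries the name of its target container and checks it before mutating anything, so sidecar and init containers are left untouched:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical decorator: only touches the container it was created for.
public class AddEnvDecorator {
    static class Container {
        final String name;
        final List<String> env = new ArrayList<>();
        Container(String name) { this.name = name; }
    }

    private final String targetContainer;
    private final String envEntry;

    AddEnvDecorator(String targetContainer, String envEntry) {
        this.targetContainer = targetContainer;
        this.envEntry = envEntry;
    }

    void apply(List<Container> containers) {
        for (Container c : containers) {
            if (c.name.equals(targetContainer)) { // check the target before decorating
                c.env.add(envEntry);
            }
        }
    }

    public static void main(String[] args) {
        Container app = new Container("app");
        Container sidecar = new Container("istio-proxy");
        new AddEnvDecorator("app", "KEY=value").apply(List.of(app, sidecar));
        System.out.println("app env: " + app.env);
        System.out.println("sidecar env: " + sidecar.env);
    }
}
```

The same guard generalizes to deployments: a decorator that cannot name its target should not be applied at all.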
As the different annotations will generate K8s or custom K8s resources, it could be interesting to also support a pipeline / steps pattern able to install different resources or to perform different actions.
Such a pattern is used, for example, by the syndesis and camel-k projects, or even by the Component operator.
Benefits :
https://github.com/snowdrop/component-operator/blob/master/pkg/pipeline/step.go#L26
// Action --
type Step interface {
    // a user friendly name for the action
    Name() string
    // returns true if the action can handle the integration
    CanHandle(component *v1alpha1.Component) bool
    // executes the handling function
    Handle(component *v1alpha1.Component, client *client.Client, namespace string) error
}
https://github.com/snowdrop/component-operator/blob/master/pkg/controller/component/handler.go#L74-L83
innerLoopSteps: []pipeline.Step{
    innerloop.NewInstallStep(),
},
serviceCatalogSteps: []pipeline.Step{
    servicecatalog.NewServiceInstanceStep(),
},
linkSteps: []pipeline.Step{
    link.NewLinkStep(),
},
}
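The Go contract above could translate to Java roughly as follows. This is only a sketch of the shape (names and the generic component type are hypothetical, not an ap4k API):

```java
import java.util.List;

// Hypothetical Java equivalent of the component-operator Step contract.
public class Pipeline {
    interface Step<T> {
        String name();            // user friendly name for the step
        boolean canHandle(T c);   // true if the step applies to this component
        void handle(T c);         // perform the step's action
    }

    static <T> void run(List<Step<T>> steps, T component) {
        for (Step<T> step : steps) {
            if (step.canHandle(component)) {
                System.out.println("Running step: " + step.name());
                step.handle(component);
            }
        }
    }

    public static void main(String[] args) {
        Step<StringBuilder> install = new Step<StringBuilder>() {
            public String name() { return "install"; }
            public boolean canHandle(StringBuilder c) { return true; }
            public void handle(StringBuilder c) { c.append("installed"); }
        };
        StringBuilder component = new StringBuilder();
        run(List.of(install), component);
        System.out.println(component);
    }
}
```

Steps stay independent and ordered, which is what makes the pattern attractive for installing different resources per annotation.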
WDYT @iocanel ?
I have discovered why we don't get the runtime value within the generated component.yml file. This is because config.getAttribute("RUNTIME_TYPE") returns null/empty.
group, name and version should not be repeated in @EnableDockerBuild, as they are already included in @KubernetesApplication and @OpenshiftApplication.
The project can't be compiled, as the ap4k deps have not been released for 1.0.2:
[ERROR] Failed to execute goal on project ap4k-core: Could not resolve dependencies for project io.ap4k:ap4k-core:jar:1.0-SNAPSHOT: Failure to find io.ap4k:ap4k-dependencies:jar:1.0.2 in http://repo1.maven.org/maven2/ was cached in the local repository, resolution will not be reattempted until the update interval of Maven Central has elapsed or updates are forced -> [Help 1]
[ERROR]
Gradle, and possibly other tools, have special treatment for artifacts containing annotation processors (e.g. they exclude them from the compile path). For Gradle to reason about this, the annotations and the processor must live in different artifacts, and the processor must be explicitly declared as a processor.
The best way to achieve this without introducing too much noise is to use the shade plugin to split the artifacts and add them to the reactor with different classifiers.
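On the consumer side, Gradle has had a dedicated annotationProcessor configuration since 4.6 for exactly this separation. A sketch of what a consumer's build could then look like (the processor classifier here is hypothetical, depending on how the shade split names the artifacts):

```groovy
dependencies {
    // the annotations stay on the compile classpath
    compileOnly 'io.ap4k:ap4k-core:1.0-SNAPSHOT'
    // the processor artifact is declared explicitly as a processor
    annotationProcessor 'io.ap4k:ap4k-core:1.0-SNAPSHOT:processor'
}
```

With this split, Gradle keeps the processor off the compile classpath while still running it during compilation.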
I must add @KubernetesApplication in order to generate the resource files for the @CompositeAnnotation annotation. Can we avoid that @iocanel ?
Why do you generate a list of k8s items for the Component @iocanel, and not a Component object?
---
apiVersion: "v1"
kind: "List"
items:
- apiVersion: "v1beta1"
  kind: "Component"
  metadata:
    annotations: {}
    labels: {}
    name: "fruit-client-sb"
  spec:
    deploymentMode: "innerloop"
    exposeService: false
    image: []
    env: []
    feature: []
    link: []
    service: []
Can we avoid including the empty fields within the generated resource files? E.g. image: [], labels: {}
Can you add the missing steps needed to create an application containing a buildconfig and an imageStream, so that we can next run this command and example - https://goo.gl/RgVDby ?
Move the @Link annotation to a separate maven module. Why: as a link represents a relation which exists virtually between a component and an endpoint or service, it should be usable as a standalone annotation, like the ServiceCatalog one, to describe such METADATA to be injected within the Component target (= DeploymentConfig).
Rename component-operator-annotations to component-annotations, and likewise component-operator-examples to component-examples.
At the moment we are using two kinds of visitors:
We should use a different interface, package etc. for each kind, and the API should protect us from using the wrong kind in the wrong place.
This would possibly allow external tools to better integrate with ap4k. See: fabric8io/fabric8-maven-plugin#1483
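The type-level protection could be as simple as distinct functional interfaces per visitor kind, so that handing a model visitor to a config registry is a compile error. A stdlib-only sketch (all names hypothetical):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical separation: one interface (and registry) per visitor kind.
public class Visitors {
    interface ConfigVisitor { void visitConfig(StringBuilder config); }
    interface ModelVisitor { void visitModel(StringBuilder model); }

    static final List<ConfigVisitor> configVisitors = new ArrayList<>();
    static final List<ModelVisitor> modelVisitors = new ArrayList<>();

    public static void main(String[] args) {
        configVisitors.add(cfg -> cfg.append("config-visited"));
        modelVisitors.add(model -> model.append("model-visited"));
        // configVisitors.add((ModelVisitor) m -> {}); // would not compile: wrong kind

        StringBuilder config = new StringBuilder();
        configVisitors.forEach(v -> v.visitConfig(config));
        System.out.println(config);
    }
}
```

Placing each interface in its own package would additionally make the intent visible at the import level.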
If we change a field of the ComponentSpec class:
package io.ap4k.component.model;

import io.ap4k.deps.jackson.annotation.JsonInclude;
import io.ap4k.deps.jackson.annotation.JsonPropertyOrder;
import io.ap4k.deps.kubernetes.api.model.Doneable;
import io.sundr.builder.annotations.Buildable;
import io.sundr.builder.annotations.Inline;

import javax.annotation.Generated;

@JsonInclude(JsonInclude.Include.NON_NULL)
@Generated("org.jsonschema2pojo")
@JsonPropertyOrder({
    "name",
    "type",
    "packagingMode",
    "deploymentMode",
    "runtime",
    "version",
    "exposeService",
    "cpu",
    "strorage",
    "image",
    "env",
    "feature",
    "link"
})
@Buildable(editableEnabled = false, validationEnabled = false, generateBuilderPackage = false, builderPackage = "io.ap4k.deps.kubernetes.api.builder", inline = @Inline(type = Doneable.class, prefix = "Doneable", value = "done"))
public class ComponentSpec {

  private String name;
  private String packagingMode;
  private String type;
  private DeploymentType deployment; // deploymentMode -> deployment
and next recompile with mvn clean compile, then we get errors such as:
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2.619 s
[INFO] Finished at: 2018-11-28T10:29:50+01:00
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.3.2:compile (default-compile) on project component-annotations: Compilation failure: Compilation failure:
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/decorator/AddServiceInstanceToComponent.java:[19,30] error: cannot find symbol
[ERROR] package io.ap4k.component.model
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/decorator/AddServiceInstanceToComponent.java:[27,61] error: cannot find symbol
[ERROR] class ComponentSpecBuilder
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/decorator/AddServiceInstanceToComponent.java:[36,20] error: cannot find symbol
[ERROR] class AddServiceInstanceToComponent
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/decorator/AddServiceInstanceToComponent.java:[49,29] error: cannot find symbol
[ERROR] class AddServiceInstanceToComponent
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/decorator/AddRuntimeToComponent.java:[19,30] error: cannot find symbol
[ERROR] package io.ap4k.component.model
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/decorator/AddRuntimeToComponent.java:[24,53] error: cannot find symbol
[ERROR] class ComponentSpecBuilder
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/decorator/AddRuntimeToComponent.java:[33,20] error: cannot find symbol
[ERROR] class AddRuntimeToComponent
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/processor/CompositeAnnotationProcessor.java:[23,30] error: package io.ap4k.component.adapt does not exist
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/processor/CompositeAnnotationProcessor.java:[25,31] error: package io.ap4k.component.config does not exist
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/processor/CompositeAnnotationProcessor.java:[26,31] error: package io.ap4k.component.config does not exist
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/processor/CompositeAnnotationProcessor.java:[39,78] error: cannot find symbol
[ERROR] class CompositeConfig
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/processor/CompositeAnnotationProcessor.java:[57,31] error: cannot find symbol
[ERROR] class CompositeAnnotationProcessor
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/decorator/AddEnvToComponent.java:[19,30] error: cannot find symbol
[ERROR] package io.ap4k.component.model
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/decorator/AddEnvToComponent.java:[25,49] error: cannot find symbol
[ERROR] class ComponentSpecBuilder
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/decorator/AddEnvToComponent.java:[34,20] error: cannot find symbol
[ERROR] class AddEnvToComponent
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/ComponentHandler.java:[21,31] error: package io.ap4k.component.config does not exist
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/ComponentHandler.java:[22,31] error: package io.ap4k.component.config does not exist
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/ComponentHandler.java:[26,30] error: cannot find symbol
[ERROR] package io.ap4k.component.model
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/ComponentHandler.java:[33,49] error: cannot find symbol
[ERROR] class CompositeConfig
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/ComponentHandler.java:[49,21] error: cannot find symbol
[ERROR] class ComponentHandler
[ERROR] /Users/dabou/Code/snowdrop/ap4k/ap4k-annotations/annotations/component-annotations/src/main/java/io/ap4k/component/ComponentHandler.java:[75,36] error: cannot find symbol
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
This shouldn't have happened! No resources should be generated unless a matching annotation is found.
User-provided or externally generated resources should be picked up by ap4k and customized based on the annotations found.
For example, a user should be able to generate resources using fmp's fabric8:resources
goal and have them customized using ap4k annotations.
@SpringBootApplication
@CompositeApplication(
name = "fruit-client-sb",
links = @Link(
name = "Env var to be injected within the target component -> fruit-backend",
targetcomponentname = "fruit-client-sb",
kind = "Env",
ref = "",
envVars = @Env(
name = "OPENSHIFT_ENDPOINT_BACKEND",
value = " http://fruit-backend-sb:8080/api/fruits"
)
))
---
apiVersion: "v1"
kind: "List"
items:
- apiVersion: "component.k8s.io/v1alpha1"
kind: "Component"
metadata:
name: "fruit-client-sb"
spec:
deploymentMode: "innerloop"
runtime: "spring-boot"
exposeService: false
link:
- kind: "Env"
name: "Env var to be injected within the target component -> fruit-backend"
targetComponentName: "fruit-client-sb"
envs:
- name: "OPENSHIFT_ENDPOINT_BACKEND"
value: " http://fruit-backend-sb:8080/api/fruits"
ref: ""
- kind: "Env"
name: "Env var to be injected within the target component -> fruit-backend"
targetComponentName: "fruit-client-sb"
envs:
- name: "OPENSHIFT_ENDPOINT_BACKEND"
value: " http://fruit-backend-sb:8080/api/fruits"
ref: ""
The same issue occurs with the config below (note the duplicated env and link entries):
@CompositeApplication(
name = "fruit-backend-sb",
exposeService = true,
envVars = @Env(
name = "SPRING_PROFILES_ACTIVE",
value = "openshift-catalog"),
links = @Link(
name = "Secret to be injected as EnvVar using Service's secret",
targetcomponentname = "fruit-backend-sb",
kind = "Secret",
ref = "postgresql-db"))
@ServiceCatalog(
instances = @ServiceCatalogInstance(
name = "postgresql-db",
serviceClass = "dh-postgresql-apb",
servicePlan = "dev",
bindingSecret = "postgresql-db",
parameters = {
@Parameter(key = "postgresql_user", value = "luke"),
@Parameter(key = "postgresql_password", value = "secret"),
@Parameter(key = "postgresql_database", value = "my_data"),
@Parameter(key = "postgresql_version", value = "9.6")
}
)
)
Result
---
apiVersion: "v1"
kind: "List"
items:
- apiVersion: "component.k8s.io/v1alpha1"
kind: "Component"
metadata:
name: "fruit-backend-sb"
spec:
deploymentMode: "innerloop"
runtime: "spring-boot"
exposeService: true
env:
- name: "SPRING_PROFILES_ACTIVE"
value: "openshift-catalog"
- name: "SPRING_PROFILES_ACTIVE"
value: "openshift-catalog"
link:
- kind: "Secret"
name: "Secret to be injected as EnvVar using Service's secret"
targetComponentName: "fruit-backend-sb"
ref: "postgresql-db"
- kind: "Secret"
name: "Secret to be injected as EnvVar using Service's secret"
targetComponentName: "fruit-backend-sb"
ref: "postgresql-db"
Can we add support for generating Knative Serving services and Knative builds?
The exposeService
spec field value is always false,
even when we set it to true within the annotation:
@CompositeApplication(name = "hello-spring-boot", exposeService = true)
---
apiVersion: "v1"
kind: "List"
items:
- apiVersion: "v1beta1"
kind: "Component"
metadata:
annotations: {}
labels: {}
name: "hello-spring-boot"
spec:
deploymentMode: "innerloop"
runtime: "spring-boot"
exposeService: false
image: []
env: []
feature: []
link: []
service: []
The Component CRD spec supports env vars, but @CompositeApplication
does not propagate them. Even if we specify such info
@CompositeApplication(envVars = @Env(name = "key1", value = "val1"))
the env list in the generated YAML stays empty:
---
apiVersion: "v1"
kind: "List"
items:
- apiVersion: "v1beta1"
kind: "Component"
metadata:
annotations: {}
labels: {}
name: "component-example"
spec:
deploymentMode: "innerloop"
runtime: "spring-boot"
exposeService: false
image: []
env: []
feature: []
link: []
service: []
During a demo, I realized that I had access to io.fabric8.kubernetes.client*
. This of course shouldn't have happened, as we should never expose any dependencies other than compile-time or shaded ones.
As a user, I would like the possibility to get a generated resource created as a standalone item and not as a list.
What is currently populated
---
apiVersion: "v1"
kind: "List"
items:
- apiVersion: "v1beta1"
kind: "Component"
metadata:
annotations: {}
labels: {}
name: ""
spec:
deploymentMode: "innerloop"
runtime: "spring-boot"
exposeService: false
image: []
env:
- name: "key1"
value: "val1"
- name: "key1"
value: "val1"
feature: []
link: []
service: []
What I want
apiVersion: component.k8s.io/v1alpha1
kind: "Component"
metadata:
annotations: {}
labels: {}
name: ""
spec:
deploymentMode: "innerloop"
runtime: "spring-boot"
exposeService: false
image: []
env:
- name: "key1"
value: "val1"
feature: []
link: []
service: []
Currently, some things are hard-coded for Spring (e.g. port number, Prometheus path, etc.).
We could possibly create a CompositePropertySource
that would combine PropertiesPropertySource
with YamlPropertySource
in order to read configuration.
The tricky part here is how to read those files. The annotation processor facilities do have everything required, but we'll need to keep those abstracted if possible. A similar case is here: https://github.com/ap4k/ap4k/blob/d7c904d18277545cd80df4496b6c6bf91d5cdd5f/annotations/openshift-annotations/src/main/java/io/ap4k/openshift/generator/S2iBuildGenerator.java#L77 (getOutputDirectory() is abstracted in order to avoid leaking apt-specific stuff into the generator.)
Note: I am not 100% sold on reusing Spring Boot's own classes. Currently, we have no dependency on Spring Boot, and reusing these classes would mean introducing one (which I am not really fond of). This needs some thought.
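A dependency-free sketch of what such a CompositePropertySource could look like. The class name comes from the discussion above, but everything else is illustrative: only the composition/precedence idea is shown, and the YAML-backed source is omitted to avoid external dependencies.

```java
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.List;
import java.util.Properties;

// Hypothetical sketch: consult several property sources in order and
// return the first value found. A real implementation would also wrap
// a YAML-backed source (e.g. for application.yml).
public class CompositePropertySource {
    private final List<Properties> sources;

    public CompositePropertySource(List<Properties> sources) {
        this.sources = sources;
    }

    public String get(String key, String fallback) {
        for (Properties source : sources) {
            String value = source.getProperty(key);
            if (value != null) {
                return value; // earlier sources win
            }
        }
        return fallback;
    }

    // Helper to load a source from an in-memory .properties snippet.
    public static Properties load(String content) {
        Properties properties = new Properties();
        try {
            properties.load(new StringReader(content));
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return properties;
    }

    public static void main(String[] args) {
        CompositePropertySource config = new CompositePropertySource(List.of(
                load("server.port=9090"),           // e.g. application.properties
                load("server.port=8080\nname=app")  // e.g. built-in defaults
        ));
        System.out.println(config.get("server.port", "8080")); // 9090
        System.out.println(config.get("name", ""));            // app
    }
}
```

Whether the sources are backed by the annotation processor's Filer or by plain files would stay behind the same abstraction mentioned above.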
Add a visitor to the ComponentGenerator
in order to enrich the created Component with the runtime's type (Spring Boot, Eclipse Vert.x, Thorntail, ...).
The runtime field is used by the Operator to pick up the corresponding s2i image used to configure the DeploymentConfig/ImageStream accordingly.
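The requested enrichment can be sketched roughly as follows. The type names below are simplified stand-ins, not the actual ap4k model or builder classes:

```java
// Illustrative sketch of enriching a component with its runtime via a
// visitor. ComponentSpec and ComponentVisitor are hypothetical
// simplifications of the real ap4k types.
public class AddRuntimeSketch {

    static class ComponentSpec {
        String runtime; // e.g. "spring-boot", "vert.x", "thorntail"
    }

    interface ComponentVisitor {
        void visit(ComponentSpec spec);
    }

    static class AddRuntimeToComponent implements ComponentVisitor {
        private final String runtime;

        AddRuntimeToComponent(String runtime) {
            this.runtime = runtime;
        }

        @Override
        public void visit(ComponentSpec spec) {
            spec.runtime = runtime;
        }
    }

    // The generator would accept visitors and apply each one to the spec.
    static ComponentSpec generate(ComponentVisitor... visitors) {
        ComponentSpec spec = new ComponentSpec();
        for (ComponentVisitor visitor : visitors) {
            visitor.visit(spec);
        }
        return spec;
    }

    public static void main(String[] args) {
        ComponentSpec spec = generate(new AddRuntimeToComponent("spring-boot"));
        System.out.println(spec.runtime); // spring-boot
    }
}
```

The runtime string would be detected from the project (e.g. the presence of Spring Boot on the classpath) rather than hard-coded as in this sketch.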
As a user, I want the possibility to enable/disable the generation of k8s yaml/json files when custom resources are used, as in this case it makes more sense to install the custom resources such as Istio, the Component CRD, ...
Hello, I've made a Gradle project with Spring Boot, and I've added the Gradle dependencies plus the Kubernetes annotations.
I receive this error:
### Invalid relative name: META-INF\ap4k\kubernetes.json
I've also tried with @GeneratorOptions(outputPath = "myfile"), but I get the same error.
The problem seems to be under:
AbstractAnnotationProcessor ----> protected void write(String group, KubernetesList list) {
....
FileObject json = processingEnv.getFiler().createResource(StandardLocation.CLASS_OUTPUT, PACKAGE, project.getResourceOutputPath() + File.separatorChar + String.format(FILENAME, group, JSON));
It seems that the relative name passed to createResource() does not accept platform path separators; please try removing the separator and keeping a simple filename.
Thanks.
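The diagnosis above can be reproduced without an annotation processor: Filer.createResource expects a '/'-separated relative name, while File.separatorChar is '\' on Windows. A minimal illustration (the helper below is hypothetical, not ap4k code):

```java
import java.io.File;

// Demonstrates why building a Filer "relative name" with
// File.separatorChar breaks on Windows: the javax.annotation.processing
// API expects '/' as the separator, regardless of the host OS.
public class RelativeNameDemo {

    static String relativeName(String outputPath, String group, char separator) {
        return outputPath + separator + String.format("%s.json", group);
    }

    public static void main(String[] args) {
        // Portable: always join with '/' for Filer relative names.
        System.out.println(relativeName("META-INF/ap4k", "kubernetes", '/'));
        // On Windows File.separatorChar is '\\', producing
        // "META-INF/ap4k\kubernetes.json" -> "Invalid relative name".
        System.out.println(relativeName("META-INF/ap4k", "kubernetes", File.separatorChar));
    }
}
```

So a fix that keeps the subdirectory would be to join the path with '/' instead of File.separatorChar, rather than dropping the directory entirely.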
My app is simple:
@KubernetesApplication
@SpringBootApplication
public class KubernetesDemo2Application {
........
and the Gradle dependencies:
...
dependencies {
implementation('org.springframework.boot:spring-boot-starter-web')
testImplementation('org.springframework.boot:spring-boot-starter-test')
annotationProcessor("io.ap4k:kubernetes-annotations:${ap4kVersion}")
compileOnly("io.ap4k:kubernetes-annotations:${ap4kVersion}")
annotationProcessor("io.ap4k:ap4k-core:${ap4kVersion}")
compileOnly("io.ap4k:ap4k-core:${ap4kVersion}")
// compile("io.ap4k:ap4k-spring-boot:${ap4kVersion}")
}
...
For the build I've used Gradle 4.1 and 5.1. Same issue...
Thank you very much!
The Component API operator fails to process a YAML resource containing empty [] lists:
E1127 18:10:36.560007 1 reflector.go:205] github.com/snowdrop/component-operator/vendor/sigs.k8s.io/controller-runtime/pkg/cache/internal/informers_map.go:126:
Failed to list *v1alpha1.Component: v1alpha1.ComponentList.Items: []v1alpha1.Component:
v1alpha1.Component.Spec: v1alpha1.ComponentSpec.Link: readObjectStart: expect { or n, but found
[, error found in #10 byte of ...|],"link":[],"runtime|..., bigger context ...|oseService":false,"feature":
[],"image":[],"link":[],"runtime":"spring-boot","service":[]}}],"kind":"|...
---
apiVersion: "v1"
kind: "List"
items:
- apiVersion: "component.k8s.io/v1alpha1"
kind: "Component"
metadata:
labels: {}
name: "hello-app"
spec:
deploymentMode: "innerloop"
runtime: "spring-boot"
exposeService: false
image: []
feature: []
link: []
service: []
I manually removed the []
for the list fields such as image, feature, link, and service:
spec:
deploymentMode: "innerloop"
runtime: "spring-boot"
exposeService: false
image:
feature:
link:
service:
WDYT @iocanel
Hello Team,
I want to run mvn install on the master branch, but it seems that the build is failing: some files from the kubernetes/config path are missing... maybe under Windows the generated sources are not generated completely.
ap4k/core/src/main/java/io/ap4k/kubernetes/config/
The missing files are:
import io.ap4k.kubernetes.config.Annotation;
import io.ap4k.kubernetes.config.AwsElasticBlockStoreVolume;
import io.ap4k.kubernetes.config.AzureDiskVolume;
import io.ap4k.kubernetes.config.AzureFileVolume;
import io.ap4k.kubernetes.config.ConfigMapVolume;
import io.ap4k.kubernetes.config.Env;
import io.ap4k.kubernetes.config.KubernetesConfig;
import io.ap4k.kubernetes.config.Label;
import io.ap4k.kubernetes.config.Mount;
import io.ap4k.kubernetes.config.PersistentVolumeClaimVolume;
import io.ap4k.kubernetes.config.Port;
import io.ap4k.kubernetes.config.SecretVolume;
Currently we assume fabric8/s2i, but we could use the OpenShift s2i images for Java, which accept environment variables. The annotation should support these.
See https://docs.openshift.com/online/using_images/s2i_images/java.html
Currently, it's possible to define groups that only accept resources that have been explicitly specified.
For example: we don't want to add resources to component.yml unless explicitly specified.
At the moment this doesn't work as expected, and if the generators are not registered in the
ideal order the functionality breaks.
From @cmoulliard: "As the process is relatively complex and involves several steps, as described in the sequence diagram, it is very important to be able to log the following info at the end of the process, at debug level: annotations discovered -> config created -> configurators applied -> k8s model / custom resource definition model / openshift model created -> decorators applied, so that users can understand how the rules have been applied."
Contributors and integrators need to have a clear view of how the project works.
If one tries to do something like:
final OpenshiftConfigBuilder openshiftConfigBuilder = OpenshiftConfigAdapter.newBuilder(new HashMap());
System.out.println(openshiftConfigBuilder.build().getGroup());
the result of the print statement will be the string "null"
instead of a proper null
or an empty string.
This prevents ApplyProjectInfo
from properly applying the project info when the corresponding values haven't been set on OpenshiftApplication.
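One plausible source of the literal string "null" is a string conversion of an unset value somewhere along the adapter/builder chain; a minimal, hypothetical illustration (not the actual ap4k code):

```java
// Illustrates how the four-character string "null" can appear instead
// of a real null: String.valueOf (like string concatenation) turns a
// null reference into the literal text "null", which then looks like a
// genuinely set value to downstream null-checks.
public class NullStringDemo {

    static String group(Object configured) {
        return String.valueOf(configured); // yields "null" when unset
    }

    static String saferGroup(Object configured) {
        return configured == null ? "" : configured.toString();
    }

    public static void main(String[] args) {
        System.out.println(group(null));      // prints the string "null"
        System.out.println(saferGroup(null)); // prints an empty line
    }
}
```

If the adapter does something similar, returning null (or an empty string, as the issue suggests) for absent values would let ApplyProjectInfo detect and fill them in.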