
distribution's People

Contributors

anoukh, anugayan, dasuni-30, dependabot-preview[bot], dilini-muthumala, dnwick, erangatl, gimantha, gokul, gowthamyvaseekaran, grainier, lafernando, lasanthafdo, minudika, mohanvive, moizmali, nirandaperera, niveathika, pcnfernando, pradeepajey1, rameshka, ramindu90, raveensr, senthuran16, sinthuja, suhothayan, swsachith, tishan89, wggihan, wso2-jenkins-bot


distribution's Issues

Parser does not provide service configurations for TCP sources

Description:
When we deploy Siddhi apps with a NATS source using the Siddhi operator, the service creation does not happen. The reason is that the parser does not return the configurations needed for service creation when the app has TCP sources. For TCP sources we currently recommend that users disable ingress creation and create the ingress manually. Because the parser does not return this service configuration, users have to create not only the ingress but also the K8s service themselves. Therefore, it is better to return the TCP ports from the parser.

Suggested Labels:
Feature improvement

Affected Product Version:
5.1.0-alpha

OS, DB, other environment details and versions:
siddhi-operator:0.2.0-alpha

Steps to reproduce:
Deploy a Siddhi app like the one below using the Siddhi operator.

@App:name("PowerConsumptionSurgeDetection")
@App:description("App consumes events from HTTP as a JSON message of { 'deviceType': 'dryer', 'power': 6000 } format and inserts the events into DevicePowerStream, and alerts the user if the power consumption in 1 minute is greater than or equal to 10000W by printing a message in the log for every 30 seconds.")

/*
    Input: deviceType string and powerConsumption int (Joules)
    Output: Alert user from printing a log, if there is a power surge in the dryer within 1 minute period. 
            Notify the user in every 30 seconds when total power consumption is greater than or equal to 10000W in 1 minute time period.
*/

@source(
    type='nats',
    cluster.id='siddhi-stan',
    destination = 'PowerStream', 
    bootstrap.servers='nats://siddhi-nats:4222',
    @map(type='text')
)
define stream DevicePowerStream(deviceType string, power int);

@sink(type='log', prefix='LOGGER')
define stream PowerSurgeAlertStream(deviceType string, powerConsumed long);

@info(name='surge-detector')
from DevicePowerStream#window.time(1 min)
select deviceType, sum(power) as powerConsumed
group by deviceType
having powerConsumed > 10000
output every 30 sec
insert into PowerSurgeAlertStream;
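
For reference, once the parser returns the TCP ports, the Kubernetes Service that the operator could then create might look roughly like the sketch below. The port number, labels, and names here are illustrative assumptions, not values produced by the current parser.

apiVersion: v1
kind: Service
metadata:
  name: power-consumption-surge-detection        # hypothetical, derived from the app name
spec:
  selector:
    siddhi.io/instance: power-consumption-surge-detection   # illustrative label
  ports:
    - name: tcp-source
      protocol: TCP
      port: 9892                                 # illustrative TCP port that the parser would report
      targetPort: 9892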

Distribution build artifacts are not shown for Jenkins build

Description:
There are two build artifacts in the "distribution" repo of Siddhi: siddhi-runner-x.x.x.zip and siddhi-tooling-x.x.x.zip.

But once the Jenkins release job completes, these build artifacts cannot be found in the Jenkins web UI, and we ended up fetching them from the Nexus repository.

We need to add the file paths below so that these files are listed as build artifacts.

distribution/runner/target/siddhi-runner-x.x.x.zip
distribution/tooling/target/siddhi-tooling-x.x.x.zip

Support downloading docker & K8 artifacts from editor

Description:
At the moment, we use the Siddhi Tooling editor to develop Siddhi applications. Once we create a functional Siddhi application, we need to create a Siddhi custom resource object with the created Siddhi application in order to deploy it in a Kubernetes environment. We need to provide support to create this Siddhi custom resource object from the tooling editor itself, to minimize this gap between development and deployment.

Furthermore, we need to support the user to create a docker image bundling the needed extensions and third-party dependencies from the developing environment (Siddhi Tooling editor).

My suggestion would be to add the below tools to the editor.

  • Download Siddhi Kubernetes artifacts.
  • Download Docker image.

"Download Siddhi Kubernetes artifacts" would download a zip containing the Siddhi custom resource object and the needed prerequisites.
"Download Docker image" would create a zip with the corresponding version of the runner Docker build file and the needed dependencies.
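
As a rough illustration of what the downloaded Kubernetes artifact zip could contain, the Siddhi custom resource might look like the sketch below. The apiVersion, field layout, and all values are assumptions for illustration; only spec.container.image is taken from the export flow discussed elsewhere in this listing.

apiVersion: siddhi.io/v1alpha2          # illustrative; depends on the installed Siddhi operator version
kind: SiddhiProcess
metadata:
  name: power-consumption-app           # hypothetical name
spec:
  apps:
    - script: |
        @App:name("PowerConsumptionSurgeDetection")
        -- ... rest of the Siddhi app ...
  container:
    image: mydockerhub/power-consumption-app:0.1.0   # hypothetical image built from the bundled Dockerfile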

Affected Product Version:
5.1.0-m1

Build Issues in Windows environment

Description:
Some build issues are observed in the Windows environment, as given below.

  • Issues related to some long file paths in the editor component (we may need to reorganize this)
  • File name case-insensitivity issue: there are two files named SAMPLE(ORDERED).csv and sample(ordered).csv.

[Tooling] Unnecessary Siddhi app copy in the Dockerfile of the K8s export

Description:
When we export Siddhi apps as Docker, the Dockerfile contains a COPY entry for copying the Siddhi files. For the K8s export, however, we do not need that entry because the Siddhi apps are deployed through the K8s YAML file. Having that COPY entry in the Dockerfile is therefore redundant, and it may cause issues when deploying the K8s YAML.
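
To make the redundancy concrete, the generated Dockerfile for the K8s export would contain something along the lines of the sketch below; the base image name and paths are illustrative assumptions, not the exact generated content.

FROM siddhiio/siddhi-runner-alpine:5.1.0             # assumed base image
# Redundant for the K8s export: the same apps are already embedded in the K8s YAML
COPY siddhi-files/ /home/siddhi_user/siddhi-files/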

Suggested Labels:
Bug fix

Affected Product Version:
5.1.0-beta

OS, DB, other environment details and versions:
Java version "1.8.0_201"
Java(TM) SE Runtime Environment (build 1.8.0_201-b09)

[Editor] Tooling editor failed to generate the design view

Description:
I implemented the following app using the tooling editor and tried to view the flow diagram using the design view.

@source(
  type='http',
  receiver.url='http://0.0.0.0:8087/sample',
  basic.auth.enabled='false',
  @map(type='json')
)
define stream CurrentVehicleAreaInputStream(vehicleId string, currentArea string);

-- MongoDB store
@Store(type='mongodb', mongodb.uri='mongodb://127.0.0.1:27017/parisTma?&gssapiServiceName=mongodb')
define table subscriber(vehicleId string, currentArea string);

-- Retrieves events from the HTTP source and updates or inserts into MongoDB
@info(name='update-or-insert-subscriber-location') 
from CurrentVehicleAreaInputStream
update or insert into subscriber 
on subscriber.vehicleId==vehicleId;

But it fails to generate the design view. The problem seems to occur when it tries to render the query; when I generate the design view with only the source and the store, it works fine.

The given error is like below.

ERROR {io.siddhi.distribution.editor.core.util.designview.utilities.ConfigBuildingUtilities} - Failed to get the string since Start index and/or End index of the SiddhiElement are/is null

Suggested Labels:
Bug fix

Affected Product Version:
5.1.0-beta

OS, DB, other environment details and versions:
Java version "1.8.0_201"
Java(TM) SE Runtime Environment (build 1.8.0_201-b09)

Steps to reproduce:
Deploy the above app using tooling and try to get the design view.

Siddhi Cloud Native CI/CD Process Implementation

Description:
We have two runtimes: Siddhi Runner and Siddhi Tooling. We need to improve how these runtimes fit into the CI/CD process.

This also includes integration with Jenkins, Dockerhub & Kubernetes.

[Documentation] Document the default configurations used in distributions

Description:
As part of cleaning up the configurations to make them simpler, the default configurations will be removed from deployment.yml. Hence, we need to document the default configurations used at runtime for the user's reference.
This would also provide a reference to the configuration schemas for the different components in a single place.

Affected Product Version:
5.1.0-m1

Null pointer exception when providing incorrect database name in rdbms:query

Description:

When an incorrect database name is provided, this exception occurs in rdbms:query.
(rdbms:cud also has this issue)

e.g., when you don't have a database named Customers and you write the query as:

from DataStream#rdbms:query('Customers', 'select name,amount from CustomerTable where name=?', name, 'selectName string,amount int')
select selectName,amount
insert into RecordStream;

ERROR {io.siddhi.core.SiddhiAppRuntimeImpl} - Error starting Siddhi App 'Query-rdbms', triggering shutdown process. Datasource 'Customer' cannot be connected.
ERROR {io.siddhi.core.stream.StreamJunction} - Error in 'Query-rdbms' after consuming events from Stream 'DataStream', null. Hence, dropping event 'Event{timestamp=1568885284452, data=[prabod], isExpired=false}' java.lang.NullPointerException
at io.siddhi.extension.execution.rdbms.QueryStreamProcessor.getConnection(QueryStreamProcessor.java:291)
at io.siddhi.extension.execution.rdbms.QueryStreamProcessor.process(QueryStreamProcessor.java:248)
at io.siddhi.core.query.processor.stream.StreamProcessor.processEventChunk(StreamProcessor.java:41)
at io.siddhi.core.query.processor.stream.AbstractStreamProcessor.process(AbstractStreamProcessor.java:132)
at io.siddhi.core.query.input.ProcessStreamReceiver.processAndClear(ProcessStreamReceiver.java:183)
at io.siddhi.core.query.input.ProcessStreamReceiver.process(ProcessStreamReceiver.java:90)
at io.siddhi.core.query.input.ProcessStreamReceiver.receive(ProcessStreamReceiver.java:128)
at io.siddhi.core.stream.StreamJunction.sendEvent(StreamJunction.java:199)
at io.siddhi.core.stream.StreamJunction$Publisher.send(StreamJunction.java:474)
at io.siddhi.core.stream.input.InputDistributor.send(InputDistributor.java:34)
at io.siddhi.core.stream.input.InputEntryValve.send(InputEntryValve.java:45)
at io.siddhi.core.stream.input.InputHandler.send(InputHandler.java:73)
at io.siddhi.distribution.editor.core.internal.DebuggerEventStreamService.pushEvent(DebuggerEventStreamService.java:74)
at io.siddhi.distribution.event.simulator.core.internal.generator.SingleEventGenerator.sendEvent(SingleEventGenerator.java:85)
at io.siddhi.distribution.event.simulator.core.impl.SingleApiServiceImpl.runSingleSimulation(SingleApiServiceImpl.java:20)
at io.siddhi.distribution.event.simulator.core.api.SingleApi.runSingleSimulation(SingleApi.java:72)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.wso2.msf4j.internal.router.HttpMethodInfo.invokeResource(HttpMethodInfo.java:187)
at org.wso2.msf4j.internal.router.HttpMethodInfo.invoke(HttpMethodInfo.java:143)
at org.wso2.msf4j.internal.MSF4JHttpConnectorListener.dispatchMethod(MSF4JHttpConnectorListener.java:218)
at org.wso2.msf4j.internal.MSF4JHttpConnectorListener.lambda$onMessage$58(MSF4JHttpConnectorListener.java:129)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

Osgi test failure in Appveyor build

Description:
The SiddhiStoreAPITestcase.testSelectAllWithSuccessResponse() test fails intermittently, since the test Siddhi app fails to deploy with a "No extension exist for source:tcp" exception.

Affected Product Version:
master branch

OS, DB, other environment details and versions:
Appveyor

Steps to reproduce:
When Appveyor runs build for a PR

Kubernetes export error when $. characters are in the Siddhi app

Description:
When exporting a Siddhi app into K8s artifacts it gives the following error.

[2019-09-09 13:08:52,925] ERROR {io.siddhi.distribution.editor.core.internal.EditorMicroservice} - Cannot generate export-artifacts archive. java.lang.IllegalArgumentException: Illegal group reference
	at java.util.regex.Matcher.appendReplacement(Matcher.java:857)
	at java.util.regex.Matcher.replaceAll(Matcher.java:955)
	at java.lang.String.replaceAll(String.java:2223)
	at io.siddhi.distribution.editor.core.internal.ExportUtils.getKubernetesFile(ExportUtils.java:431)
	at io.siddhi.distribution.editor.core.internal.ExportUtils.createZipFile(ExportUtils.java:246)
	at io.siddhi.distribution.editor.core.internal.EditorMicroservice.exportApps(EditorMicroservice.java:1113)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.wso2.msf4j.internal.router.HttpMethodInfo.invokeResource(HttpMethodInfo.java:187)
	at org.wso2.msf4j.internal.router.HttpMethodInfo.invoke(HttpMethodInfo.java:143)
	at org.wso2.msf4j.internal.MSF4JHttpConnectorListener.dispatchMethod(MSF4JHttpConnectorListener.java:218)
	at org.wso2.msf4j.internal.MSF4JHttpConnectorListener.lambda$onMessage$58(MSF4JHttpConnectorListener.java:129)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)

The reason for this error was having an HTTP service spec like below.

-- HTTP source
@source(
    type='http-service',
    source.id='adder',
    receiver.url='http://0.0.0.0:8088/comboSuperMart/promo',
    basic.auth.enabled='',
    @map(type='json', @attributes(messageId='trp:messageId', promoCardId='$.event.promoCardId', amount='$.event.amount'))
)

The special character sequence $. is the reason for this issue, and we have to escape it before doing the replacements.
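
One standard way to avoid the "Illegal group reference" failure is to quote the replacement value before calling replaceAll(), so that $ and \ in attribute mappings such as '$.event.amount' are treated literally. The snippet below is a minimal sketch of that approach; the template and placeholder names are hypothetical, and this is not the actual ExportUtils code.

import java.util.regex.Matcher;

public class ReplacementSketch {
    public static void main(String[] args) {
        String template = "apps: {{SIDDHI_APP_CONTENT}}";             // hypothetical template placeholder
        String appContent = "amount='$.event.amount'";                // value containing the special $. sequence
        // Without quoting, replaceAll() treats "$." in the replacement as a group reference
        // and throws IllegalArgumentException: Illegal group reference.
        String rendered = template.replaceAll("\\{\\{SIDDHI_APP_CONTENT\\}\\}",
                Matcher.quoteReplacement(appContent));                // escapes '$' and '\' in the replacement
        System.out.println(rendered);                                 // prints: apps: amount='$.event.amount'
    }
}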

Suggested Labels:
Bug report

Affected Product Version:
5.1.0-alpha

Steps to reproduce:
Export the following sample app with HTTP service spec from Siddhi tooling editor.

-- HTTP source
@source(
    type='http-service',
    source.id='adder',
    receiver.url='http://0.0.0.0:8088/comboSuperMart/promo',
    basic.auth.enabled='',
    @map(type='json', @attributes(messageId='trp:messageId', promoCardId='$.event.promoCardId', amount='$.event.amount'))
)

[Tooling] K8s exported YAML has a wrong name format

Description:
In the process of exporting K8s artifacts, if the user does not specify name: sp-name in the last step, the name in the resulting YAML appears as below.

metadata:
  name: {{SIDDHI_PROCESS_NAME}}

When we try to install the YAML using kubectl, it gives an error because K8s only accepts names consisting of lower-case alphanumeric characters, '-', and '.'.
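
For comparison, a name that kubectl accepts when the user specifies it in the last export step would look like the following (the value itself is just an example):

metadata:
  name: power-surge-app   # lower-case alphanumerics, '-' and '.' only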

Suggested Labels:
Bug fix

Affected Product Version:
5.1.0-beta

OS, DB, other environment details and versions:
Java version "1.8.0_201"
Java(TM) SE Runtime Environment (build 1.8.0_201-b09)

Steps to reproduce:
Export Siddhi file without specifying the name.

Siddhi runner not running in Java 11

Description:
I ran the Siddhi runner alpha version on Java 11 and it gave the following error.

Sep 03, 2019 4:40:53 AM org.wso2.carbon.tools.InstallJarsTool backupLibDirectory
INFO: Backed up lib to /home/siddhi_user/siddhi-runner/_lib
Sep 03, 2019 4:40:53 AM org.wso2.carbon.tools.converter.utils.BundleGeneratorUtils convertFromJarToBundle
WARNING: Path /home/siddhi_user/siddhi-runner/jars/java-nats-streaming-2.1.2.jar refers to an OSGi bundle
Sep 03, 2019 4:40:53 AM org.wso2.carbon.tools.converter.utils.BundleGeneratorUtils convertFromJarToBundle
INFO: Created the OSGi bundle java_nats_streaming_2.1.2_1.0.0.jar for JAR file /home/siddhi_user/siddhi-runner/jars/java-nats-streaming-2.1.2.jar
Sep 03, 2019 4:40:53 AM org.wso2.carbon.tools.converter.utils.BundleGeneratorUtils convertFromJarToBundle
WARNING: Path /home/siddhi_user/siddhi-runner/jars/jnats-2.3.0.jar refers to an OSGi bundle
Sep 03, 2019 4:40:53 AM org.wso2.carbon.tools.converter.utils.BundleGeneratorUtils convertFromJarToBundle
INFO: Created the OSGi bundle jnats_2.3.0_1.0.0.jar for JAR file /home/siddhi_user/siddhi-runner/jars/jnats-2.3.0.jar
Sep 03, 2019 4:40:53 AM org.wso2.carbon.tools.converter.utils.BundleGeneratorUtils convertFromJarToBundle
WARNING: Path /home/siddhi_user/siddhi-runner/jars/protobuf-java-3.6.1.jar refers to an OSGi bundle
Sep 03, 2019 4:40:53 AM org.wso2.carbon.tools.converter.utils.BundleGeneratorUtils convertFromJarToBundle
INFO: Created the OSGi bundle protobuf_java_3.6.1_1.0.0.jar for JAR file /home/siddhi_user/siddhi-runner/jars/protobuf-java-3.6.1.jar
Sep 03, 2019 4:40:53 AM org.wso2.carbon.tools.InstallJarsTool updateMetaFile
INFO: .meta successfully created on /home/siddhi_user/siddhi-runner/.meta
 Starting WSO2 Carbon (in unsupported JDK)
 [ERROR] CARBON is supported only on JDK 1.8
JAVA_HOME environment variable is set to /opt/java/openjdk
CARBON_HOME environment variable is set to /home/siddhi_user/siddhi-runner
RUNTIME_HOME environment variable is set to /home/siddhi_user/siddhi-runner/wso2/runner
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
-Djava.endorsed.dirs=/home/siddhi_user/siddhi-runner/bin/bootstrap/endorsed:/opt/java/openjdk/jre/lib/endorsed:/opt/java/openjdk/lib/endorsed is not supported. Endorsed standards and standalone APIs
in modular form will be supported via the concept of upgradeable modules.

Here I built the Siddhi runner Docker image using the Java 11 Alpine (adoptopenjdk/openjdk11:x86_64-alpine-jre11u-nightly) image and ran it.
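
As a workaround sketch only, building the runner image on a JDK 8 base instead of JDK 11 avoids the "CARBON is supported only on JDK 1.8" check. The base image tag and paths below are assumptions for illustration, not the official Dockerfile.

FROM adoptopenjdk/openjdk8:alpine-jre                    # assumed JDK 8 base image
COPY siddhi-runner/ /home/siddhi_user/siddhi-runner/     # illustrative path to the unpacked distribution
ENTRYPOINT ["/home/siddhi_user/siddhi-runner/bin/runner.sh"]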

Suggested Labels:
Bug report

Suggested Assignees:
N/A

Affected Product Version:
Siddhi Distribution Release 5.1.0-alpha

OS, DB, other environment details and versions:
openjdk11

Steps to reproduce:

Related Issues:
N/A

Startup error for Siddhi tooling distribution

Description:
The below error is given when starting the Siddhi tooling distribution on macOS.

Mohanadarshans-MacBook-Pro:bin mohan$ ./tooling.sh 
JAVA_HOME environment variable is set to /Library/Java/JavaVirtualMachines/jdk1.8.0_212.jdk/Contents/Home
CARBON_HOME environment variable is set to /Users/mohan/source-code/mohan/distribution/tooling/target/siddhi-tooling-5.1.0-SNAPSHOT
RUNTIME_HOME environment variable is set to /Users/mohan/source-code/mohan/distribution/tooling/target/siddhi-tooling-5.1.0-SNAPSHOT/wso2/tooling
[2019-07-09 18:14:23,422]  INFO {org.wso2.carbon.launcher.extensions.OSGiLibBundleDeployerUtils updateOSGiLib} - Successfully updated the OSGi bundle information of Carbon Runtime: tooling  
osgi> org.osgi.framework.BundleException: Unable to acquire the state change lock for the module: osgi.identity; osgi.identity="org.ops4j.pax.logging.pax-logging-log4j2"; type="osgi.bundle"; version:Version="1.10.0" [id=5] STARTED [STARTED]
	at org.eclipse.osgi.container.Module.lockStateChange(Module.java:337)
	at org.eclipse.osgi.container.Module.start(Module.java:401)
	at org.eclipse.osgi.internal.framework.EquinoxBundle.start(EquinoxBundle.java:383)
	at org.eclipse.osgi.internal.framework.EquinoxBundle.start(EquinoxBundle.java:402)
	at org.eclipse.equinox.internal.simpleconfigurator.ConfigApplier.startBundles(ConfigApplier.java:453)
	at org.eclipse.equinox.internal.simpleconfigurator.ConfigApplier.install(ConfigApplier.java:111)
	at org.eclipse.equinox.internal.simpleconfigurator.SimpleConfiguratorImpl.applyConfiguration(SimpleConfiguratorImpl.java:191)
	at org.eclipse.equinox.internal.simpleconfigurator.SimpleConfiguratorImpl.applyConfiguration(SimpleConfiguratorImpl.java:205)
	at org.eclipse.equinox.internal.simpleconfigurator.Activator.start(Activator.java:60)
	at org.eclipse.osgi.internal.framework.BundleContextImpl$3.run(BundleContextImpl.java:774)
	at org.eclipse.osgi.internal.framework.BundleContextImpl$3.run(BundleContextImpl.java:1)
	at java.security.AccessController.doPrivileged(Native Method)
	at org.eclipse.osgi.internal.framework.BundleContextImpl.startActivator(BundleContextImpl.java:767)
	at org.eclipse.osgi.internal.framework.BundleContextImpl.start(BundleContextImpl.java:724)
	at org.eclipse.osgi.internal.framework.EquinoxBundle.startWorker0(EquinoxBundle.java:932)
	at org.eclipse.osgi.internal.framework.EquinoxBundle$EquinoxModule.startWorker(EquinoxBundle.java:309)
	at org.eclipse.osgi.container.Module.doStart(Module.java:581)
	at org.eclipse.osgi.container.Module.start(Module.java:449)
	at org.eclipse.osgi.internal.framework.EquinoxBundle.start(EquinoxBundle.java:383)
	at org.eclipse.osgi.internal.framework.EquinoxBundle.start(EquinoxBundle.java:402)
	at org.wso2.carbon.launcher.CarbonServer.loadInitialBundles(CarbonServer.java:242)
	at org.wso2.carbon.launcher.CarbonServer.start(CarbonServer.java:83)
	at org.wso2.carbon.launcher.Main.main(Main.java:84)
Caused by: java.util.concurrent.TimeoutException: Timeout after waiting 5 seconds to acquire the lock.
	at org.eclipse.osgi.container.Module.lockStateChange(Module.java:334)
	... 22 more
[2019-07-09 18:14:49,939]  INFO {org.wso2.msf4j.internal.websocket.WebSocketServerSC} - All required capabilities are available of WebSocket service component is available.
[2019-07-09 18:14:49,961]  INFO {org.wso2.carbon.metrics.core.config.model.JmxReporterConfig} - Creating JMX reporter for Metrics with domain 'org.wso2.carbon.metrics'
[2019-07-09 18:14:49,980]  INFO {org.wso2.msf4j.analytics.metrics.MetricsComponent} - Metrics Component is activated
[2019-07-09 18:14:49,982]  INFO {org.wso2.carbon.databridge.agent.internal.DataAgentDS} - Successfully deployed Agent Server 
[2019-07-09 18:14:50,089]  INFO {io.siddhi.distribution.editor.core.internal.WorkspaceDeployer} - Workspace artifact deployer initiated.
[2019-07-09 18:14:50,093]  INFO {io.siddhi.distribution.event.simulator.core.service.CSVFileDeployer} - CSV file deployer initiated.
[2019-07-09 18:14:50,095]  INFO {io.siddhi.distribution.event.simulator.core.service.SimulationConfigDeployer} - Simulation config deployer initiated.

osgi> 
osgi> [2019-07-09 18:15:09,647]  INFO {io.siddhi.distribution.editor.core.internal.StartupComponent} - Editor Started on : http://10.100.0.99:9390/editor
[2019-07-09 18:15:09,647]  INFO {org.wso2.msf4j.internal.MicroservicesServerSC} - All microservices are available

osgi> 
osgi> [2019-07-09 18:15:14,710]  INFO {org.wso2.transport.http.netty.contractimpl.listener.ServerConnectorBootstrap$HttpServerConnector} - HTTP(S) Interface starting on host 0.0.0.0 and port 9390
[2019-07-09 18:15:14,710]  INFO {org.wso2.transport.http.netty.contractimpl.listener.ServerConnectorBootstrap$HttpServerConnector} - HTTP(S) Interface starting on host 0.0.0.0 and port 9743
[2019-07-09 18:15:14,739]  INFO {org.wso2.carbon.databridge.receiver.binary.internal.BinaryDataReceiverServiceComponent} - org.wso2.carbon.databridge.receiver.binary.internal.Service Component is activated
[2019-07-09 18:15:14,742]  INFO {org.wso2.carbon.databridge.receiver.thrift.internal.ThriftDataReceiverDS} - Service Component is activated
[2019-07-09 18:15:14,745]  INFO {org.wso2.carbon.kernel.internal.CarbonStartupHandler} - Siddhi Tooling Distribution started in 56.678 sec

[Tooling] Enable docker build and push using editor

Description:
Currently, Siddhi tooling supports automatic export of Siddhi apps to Docker and K8s. When it comes to the K8s export, the current export facility requires the user to go through the following process.

  1. Export the Siddhi apps as K8s.
  2. Build the custom Docker image using docker build -t <IMAGE> .
  3. Push that Docker image to a user Docker registry.
  4. Add the Docker image name to the spec.container.image spec in the siddhi-process.yaml file.
  5. Then apply the siddhi-process.yaml using kubectl (the commands for steps 2, 3, and 5 are sketched below).
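
These typically boil down to commands along the following lines; the image name, registry, and file name are placeholders:

# Build the custom image from the exported artifacts (run in the extracted export directory)
docker build -t myregistry/my-siddhi-app:0.1.0 .
# Push the image to the user's Docker registry
docker push myregistry/my-siddhi-app:0.1.0
# After setting spec.container.image in siddhi-process.yaml, deploy it
kubectl apply -f siddhi-process.yaml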

This is quite a long process to deploy the exported Siddhi apps.

If Siddhi tooling could build the Docker image by itself and push it to a user-given Docker registry account, we could reduce this to the two steps below.

  1. In the K8s export process, the user gives the credentials needed to push the Docker image, and Siddhi tooling pushes the image.
  2. Then the user just applies the siddhi-process.yaml using kubectl.

Suggested Labels:
Type improvement

Affected Product Version:
5.1.0-beta

Related Issues:
#247

[Siddhi Runner] -Dapps and -Dconfig fails with relative paths

Description:
When we use relative paths as values for -Dapps/-Dconfig system arguments while starting Siddhi Runner, the corresponding functionalities fail.

Affected Product Version:
Siddhi Runner 5.1.0-m2

Steps to reproduce:

  • Copy a sample Siddhi file (SampleApp.siddhi) into the SIDDHI_RUNNER_HOME/wso2 directory.
  • Go to the SIDDHI_RUNNER_HOME/bin directory and execute the following command:
    ./runner.sh -Dapps=../wso2/SampleApp.siddhi
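
For comparison, the same flag with an absolute path is expected to work; the path below is illustrative and assumes the behaviour described in this report:

    ./runner.sh -Dapps=/home/user/siddhi-runner/wso2/SampleApp.siddhi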

Errors thrown during hard refresh of the UI

Description:
The following error is thrown on a hard refresh of the browser:
osgi> [2019-03-16 21:16:28,026] ERROR {org.wso2.transport.http.netty.common.Util} - Remote client closed the connection before completing outbound response java.io.IOException: Remote client closed the connection before completing outbound response
at org.wso2.transport.http.netty.common.Util.lambda$checkForResponseWriteStatus$9(Util.java:599)
at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:481)
at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
at io.netty.util.concurrent.DefaultPromise.addListener(DefaultPromise.java:163)
at io.netty.channel.DefaultChannelPromise.addListener(DefaultChannelPromise.java:93)
at io.netty.channel.DefaultChannelPromise.addListener(DefaultChannelPromise.java:28)
at org.wso2.transport.http.netty.common.Util.checkForResponseWriteStatus(Util.java:595)
at org.wso2.transport.http.netty.contractimpl.HttpOutboundRespListener.writeOutboundResponseHeaderAndBody(HttpOutboundRespListener.java:187)
at org.wso2.transport.http.netty.contractimpl.HttpOutboundRespListener.writeOutboundResponse(HttpOutboundRespListener.java:138)
at org.wso2.transport.http.netty.contractimpl.HttpOutboundRespListener.lambda$null$35(HttpOutboundRespListener.java:94)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)

Related Issues:
wso2/product-sp#792

Moving extension samples to tooling distribution

Description:
With the Siddhi Tooling distribution v5.1.0-m2, we are only bundling 9 mandatory samples. It would be easier for the user if we also move the extension-specific samples from samples/artifacts into the tooling distribution.
Before moving them, we would have to clean the descriptions.

Affected Product Version:
v5.1.0-m2

Fix file operation specific test cases in Windows OS

Description:
The below test cases are failing on Windows OS due to file-specific operations. We should fix this ASAP.

[00:47:38] testDeleteFilesApi(io.siddhi.distribution.test.osgi.SimulatorAPITestcase)  Time elapsed: 0.044 sec  <<< FAILURE!
[00:47:38] java.lang.AssertionError: expected [200] but found [500]
[00:47:38] 	at org.testng.Assert.fail(Assert.java:94)
[00:47:38] 	at org.testng.Assert.failNotEquals(Assert.java:496)
[00:47:38] 	at org.testng.Assert.assertEquals(Assert.java:125)
[00:47:38] 	at org.testng.Assert.assertEquals(Assert.java:372)
[00:47:38] 	at org.testng.Assert.assertEquals(Assert.java:382)
[00:47:38] 	at io.siddhi.distribution.test.osgi.SimulatorAPITestcase.testDeleteFilesApi(SimulatorAPITestcase.java:303)
[00:47:38] 
[00:47:38] testListingDirectories(io.siddhi.distribution.test.osgi.SiddhiEditorTestCase)  Time elapsed: 0.254 sec  <<< FAILURE!
[00:47:38] java.lang.AssertionError: expected [200] but found [500]
[00:47:38] 	at org.testng.Assert.fail(Assert.java:94)
[00:47:38] 	at org.testng.Assert.failNotEquals(Assert.java:496)
[00:47:38] 	at org.testng.Assert.assertEquals(Assert.java:125)
[00:47:38] 	at org.testng.Assert.assertEquals(Assert.java:372)
[00:47:38] 	at org.testng.Assert.assertEquals(Assert.java:382)
[00:47:38] 	at io.siddhi.distribution.test.osgi.SiddhiEditorTestCase.testListingDirectories(SiddhiEditorTestCase.java:139)
[00:47:38] 
[00:47:39] 
[00:47:39] Results :
[00:47:39] 
[00:47:39] Failed tests: 
[00:47:39]   SiddhiEditorTestCase.testListingDirectories:139 expected [200] but found [500]
[00:47:39]   SimulatorAPITestcase.testDeleteFilesApi:303 expected [200] but found [500]

[Exporting to Docker/K8] Zip file with same name

Description:
The zip file created by the K8s export feature has the same name as the file exported through the Docker export feature.

Affected Product Version:
siddhi-tooling-5.1.0-alpha

Steps to reproduce:

  1. Create a siddhi app and use the K8 export and Docker export feature
  2. Follow through the steps and export the file

Assignees
@BuddhiWathsala

[Siddhi Parser] While parsing the Siddhi app, parser connects to network endpoints.

Description:
When a Siddhi app is passed through the Siddhi custom resource object for a Kubernetes deployment, we use the Siddhi Parser to extract the information the Siddhi Operator needs to create the deployment.
While parsing the Siddhi app, the parser starts a SiddhiAppRuntime, which causes it to create network connections. If any network connection fails, the parsing fails.
We cannot use createSandboxSiddhiAppRuntime(), since it removes all network endpoints (sources/sinks, ...), which would make the parsing incomplete.
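
For context, the two entry points mentioned above are sketched below. This is a minimal illustration assuming the standard SiddhiManager API; siddhiAppString is a placeholder app.

import io.siddhi.core.SiddhiAppRuntime;
import io.siddhi.core.SiddhiManager;

public class ParserRuntimeSketch {
    public static void main(String[] args) {
        String siddhiAppString = "define stream InStream (value int);";   // placeholder app
        SiddhiManager siddhiManager = new SiddhiManager();

        // Regular runtime: sources/sinks keep their transports, so creating/starting it
        // can open real network connections (the behaviour reported above).
        SiddhiAppRuntime runtime = siddhiManager.createSiddhiAppRuntime(siddhiAppString);

        // Sandbox runtime: network endpoints (sources/sinks) are removed, so no network
        // calls are made, but the transport details needed by the parser are lost.
        SiddhiAppRuntime sandboxRuntime = siddhiManager.createSandboxSiddhiAppRuntime(siddhiAppString);

        runtime.shutdown();
        sandboxRuntime.shutdown();
        siddhiManager.shutdown();
    }
}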

Affected Product Version:
5.1.0-alpha

Documentation on CI/CD Story of Siddhi

Description:
Need a proper documentation/article which explains the end to end CI/CD story of Siddhi. This document will be the guide for external users to achieve the complete CI/CD cycle with Siddhi.

Workspace is not detected in tooling

Description:
When starting the Siddhi tooling, a message pops up saying "Unable to read sample file", as below.
[screenshot: unable-to-read-sample]

When trying to access the workspace, the workspace is not detected by the tool.
[screenshot: workspace-not-detected]

However, I can open files in the workspace using File->Open file.

Enable graceful shutdown for the runner

Description:
AFAIK there is no proper mechanism to shut down the Siddhi runner gracefully. I deployed the Siddhi runner in a K8s cluster, with a requirement for zero downtime for the Siddhi runner deployment. I deployed a Siddhi app with an HTTP source and routed traffic using the ingress NGINX controller.

Sometimes a Siddhi runner terminates and another Siddhi runner deployment starts consuming the HTTP requests. The terminating Siddhi runner has already-established HTTP connections, but unfortunately it shuts down immediately without responding on those connections. That leads to a 502 Bad Gateway on the client side.

During termination, the Siddhi runner gets a SIGTERM from K8s. At that point the Siddhi runner should finish all the running threads and only then terminate.
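
A common Kubernetes-side mitigation, independent of any runner-level fix, is to give the pod a termination grace period and a preStop delay so in-flight requests can drain before SIGTERM is handled. The snippet below is a generic sketch of the pod spec fields involved, not part of the current Siddhi operator output.

spec:
  terminationGracePeriodSeconds: 60            # allow up to 60s for in-flight work to finish
  containers:
    - name: siddhi-runner                      # illustrative container name
      lifecycle:
        preStop:
          exec:
            command: ["sh", "-c", "sleep 20"]  # delay shutdown so the load balancer stops routing first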

Suggested Labels:
type/fix

Affected Product Version:
v5.1.0-m1

Steps to reproduce:

  1. Deploy HTTP Siddhi app using Siddhi operator
  2. Send continuous HTTP events
  3. Update your Siddhi app and redeploy it in the K8s cluster
  4. You will see some requests getting 502

Runner deploys unwanted files in the workspace directory

Description:
My <SIDDHI-RUNNER-HOME>/wso2/worker/deployment/siddhi-files/ directory contains some symbolic-link sub-directories and files, such as

drwxr-xr-x 2 root root 4096 Mar 18 10:06 ..2019_03_18_10_06_42.481700548
lrwxrwxrwx 1 root root 31 Mar 18 10:06 ..data -> ..2019_03_18_10_06_42.481700548

apart from my *.siddhi files. When I try to deploy Siddhi files in the <SIDDHI-RUNNER-HOME>/wso2/worker/deployment/siddhi-files directory, it tries to deploy those symbolic-link sub-directories and files as well, and then gives an error such as:

[2019-03-18 10:06:50,993] ERROR {io.siddhi.distribution.core.core.internal.StreamProcessorDeployer} - /home/siddhi-runner-1.0.0-SNAPSHOT/wso2/worker/deployment/siddhi-files/..data (Is a directory) java.io.FileNotFoundException: /home/siddhi-runner-1.0.0-SNAPSHOT/wso2/worker/deployment/siddhi-files/..data (Is a directory)

So, it is better to filter and deploy only the .siddhi files from the <SIDDHI-RUNNER-HOME>/wso2/worker/deployment/siddhi-files directory, instead of deploying all the files in that directory.
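
A minimal sketch of the suggested filtering, assuming a hypothetical deployer hook (this is not the actual StreamProcessorDeployer code):

import java.io.File;

public class SiddhiFileFilterSketch {
    // True only for regular files with the .siddhi extension; symlinked entries such as the
    // "..data" directory shown above are skipped, since isFile() follows the link to a directory.
    static boolean isDeployableSiddhiFile(File file) {
        return file.isFile() && file.getName().endsWith(".siddhi");
    }

    public static void main(String[] args) {
        File dir = new File("wso2/worker/deployment/siddhi-files");   // illustrative path
        File[] entries = dir.listFiles();
        if (entries != null) {
            for (File entry : entries) {
                if (isDeployableSiddhiFile(entry)) {
                    System.out.println("Would deploy: " + entry.getName());
                }
            }
        }
    }
}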

Editor Improvements related to CI/CD Story

Description:
Due to the improvements related to the CI/CD of Siddhi, some improvements/changes are required in the Siddhi editor. Hence, this issue is created to track them.

Removing data bridge from distribution

Description:
Remove the unwanted data bridge feature. Data bridge will no longer be used in the runner and tooling. Most of the data bridge dependencies should be removed from the distribution.

That removal will make the final packs lighter.

Affected Product Version:
5.1.0-m2
