siddhi-io / distribution
Siddhi streaming runtime and tooling distribution
Home Page: http://siddhi.io
License: Apache License 2.0
Description:
When we deploy Siddhi apps with a NATS source using the Siddhi operator, the Kubernetes service is not created. The reason is that the parser does not return the configurations needed for service creation when the app has TCP sources. We currently recommend that users disable ingress creation for TCP sources and create the ingress manually. Because the parser does not return this service configuration, users have to create not only the ingress but also the Kubernetes service themselves. Therefore, it is better to return the TCP ports from the parser.
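A hedged sketch of what "return TCP ports from the parser" could look like. All type and field names here are hypothetical illustrations, not the actual parser API: the idea is that the parser response carries, per TCP-based source, the port it listens on, so the operator can build the K8s Service without manual steps.

```java
import java.util.ArrayList;
import java.util.List;

public class ParserOutput {
    // Hypothetical shape: one entry per TCP-based source in the Siddhi app.
    public static class SourcePort {
        public final String sourceType;
        public final int port;

        public SourcePort(String sourceType, int port) {
            this.sourceType = sourceType;
            this.port = port;
        }
    }

    // Sketch of the parser returning the ports it discovered; e.g. a NATS
    // source pointing at nats://siddhi-nats:4222 contributes port 4222.
    public static List<SourcePort> tcpPorts() {
        List<SourcePort> ports = new ArrayList<>();
        ports.add(new SourcePort("nats", 4222));
        return ports;
    }
}
```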
Suggested Labels:
Feature improvement
Affected Product Version:
5.1.0-alpha
OS, DB, other environment details and versions:
siddhi-operator:0.2.0-alpha
Steps to reproduce:
Deploy Siddhi app like below using Siddhi operator.
@App:name("PowerConsumptionSurgeDetection")
@App:description("App consumes events from HTTP as a JSON message of { 'deviceType': 'dryer', 'power': 6000 } format and inserts the events into DevicePowerStream, and alerts the user if the power consumption in 1 minute is greater than or equal to 10000W by printing a message in the log for every 30 seconds.")
/*
Input: deviceType string and powerConsumption int (Joules)
Output: Alert user from printing a log, if there is a power surge in the dryer within 1 minute period.
Notify the user in every 30 seconds when total power consumption is greater than or equal to 10000W in 1 minute time period.
*/
@source(
type='nats',
cluster.id='siddhi-stan',
destination = 'PowerStream',
bootstrap.servers='nats://siddhi-nats:4222',
@map(type='text')
)
define stream DevicePowerStream(deviceType string, power int);
@sink(type='log', prefix='LOGGER')
define stream PowerSurgeAlertStream(deviceType string, powerConsumed long);
@info(name='surge-detector')
from DevicePowerStream#window.time(1 min)
select deviceType, sum(power) as powerConsumed
group by deviceType
having powerConsumed > 10000
output every 30 sec
insert into PowerSurgeAlertStream;
Description:
Remove the Siddhi namespace used in deployment.yml for refs and system properties, and bring those configurations under the root namespace.
Affected Product Version:
5.1.0-m1
Description:
There are two build artifacts in the "distribution" repo of Siddhi: siddhi-runner-x.x.x.zip and siddhi-tooling-x.x.x.zip.
However, once the Jenkins release job completes, these build artifacts cannot be found in the Jenkins web UI; we ended up fetching them from the Nexus repository.
We need to add the file paths below so that these files are listed as build artifacts.
distribution/runner/target/siddhi-runner-x.x.x.zip
distribution/tooling/target/siddhi-tooling-x.x.x.zip
Description:
At the moment, we use the Siddhi Tooling editor to develop Siddhi applications. Once we create a functional Siddhi application, we need to create a Siddhi Custom Resource object with it to deploy it in a Kubernetes environment. We should provide support for creating this Siddhi Custom Resource object from the tooling editor itself, to minimize the gap between development and deployment.
Furthermore, we should support the user in creating a Docker image bundling the needed extensions and third-party dependencies from the development environment (the Siddhi Tooling editor).
My suggestion would be to add the below tools to the editor.
"Download Siddhi Kubernetes artifacts" would download a zip containing the Siddhi Custom Resource object and the needed prerequisites.
"Download Docker image" would create a zip with the corresponding version of the runner Docker build file together with the needed dependencies.
Affected Product Version:
5.1.0-m1
Description:
Some build issues were observed in the Windows environment, as given below.
Description:
$subject. Error log:
Error: Could not find or load main class org.wso2.carbon.tools.CarbonToolExecutor
Affected Product Version:
5.1.0-M2
Steps to reproduce:
Description:
When we export Siddhi apps as Docker artifacts, the generated Dockerfile contains a COPY entry for copying the Siddhi files. For the K8s export we do not need that entry, because the Siddhi apps are deployed through the K8s YAML file. Having that COPY entry in the Dockerfile is therefore redundant, and it may cause issues when deploying the K8s YAML.
Suggested Labels:
Bug fix
Affected Product Version:
5.1.0-beta
OS, DB, other environment details and versions:
Java version "1.8.0_201"
Java(TM) SE Runtime Environment (build 1.8.0_201-b09)
Description:
$subject. If the export microservice call fails, the user is not notified and only a console log is available.
Affected Product Version:
5.1.0-alpha
Steps to reproduce:
Description:
Include a README file in the Docker and K8s export artifacts that describes the steps to run them.
Description:
I implemented the following app using the tooling editor and tried to view the flow diagram using the design view.
@source(
type='http',
receiver.url='http://0.0.0.0:8087/sample',
basic.auth.enabled='false',
@map(type='json')
)
define stream CurrentVehicleAreaInputStream(vehicleId string, currentArea string);
-- MongoDB store
@Store(type='mongodb', mongodb.uri='mongodb://127.0.0.1:27017/parisTma?&gssapiServiceName=mongodb')
define table subscriber(vehicleId string, currentArea string);
-- Retrieves events from the HTTP source and updates MongoDB
@info(name='update-or-insert-subscriber-location')
from CurrentVehicleAreaInputStream
update or insert into subscriber
on subscriber.vehicleId==vehicleId;
But it failed to generate the design view. The problem might occur when it tries to render the query; when I generate the design view using only the source and the store, it works fine.
The given error is like below.
ERROR {io.siddhi.distribution.editor.core.util.designview.utilities.ConfigBuildingUtilities} - Failed to get the string since Start index and/or End index of the SiddhiElement are/is null
Suggested Labels:
Bug fix
Affected Product Version:
5.1.0-beta
OS, DB, other environment details and versions:
Java version "1.8.0_201"
Java(TM) SE Runtime Environment (build 1.8.0_201-b09)
Steps to reproduce:
Deploy the above app using tooling and try to get the design view.
Description:
We have two runtimes: Siddhi Runner and Siddhi Tooling. We need to improve how these runtimes fit into the CI/CD process.
This also includes integration with Jenkins, Docker Hub, and Kubernetes.
Description:
As part of simplifying the configurations, the default configurations will be removed from deployment.yml. Hence, we need to document the default configurations used at runtime for the user's reference.
This would also provide a reference to the configuration schemas of the different components in a single place.
Affected Product Version:
5.1.0-m1
Description:
$subject, as the current set of APIs has redundancies and complex path parameters.
Affected Product Version:
5.1.0- Beta
Description:
The auto-completion stops working after a misspelling; it cannot be made to work again unless you restart the statement from the beginning.
Affected Product Version:
5.1.0-m1
Description:
When an incorrect database name is provided, this exception occurs in rdbms:query.
(rdbms:cud also has this issue.)
E.g., when there is no database named Customers and you write the query as,
from DataStream#rdbms:query('Customers', 'select name,amount from CustomerTable where name=?', name, 'selectName string,amount int')
select selectName,amount
insert into RecordStream;
ERROR {io.siddhi.core.SiddhiAppRuntimeImpl} - Error starting Siddhi App 'Query-rdbms', triggering shutdown process. Datasource 'Customer' cannot be connected.
ERROR {io.siddhi.core.stream.StreamJunction} - Error in 'Query-rdbms' after consuming events from Stream 'DataStream', null. Hence, dropping event 'Event{timestamp=1568885284452, data=[prabod], isExpired=false}' java.lang.NullPointerException
at io.siddhi.extension.execution.rdbms.QueryStreamProcessor.getConnection(QueryStreamProcessor.java:291)
at io.siddhi.extension.execution.rdbms.QueryStreamProcessor.process(QueryStreamProcessor.java:248)
at io.siddhi.core.query.processor.stream.StreamProcessor.processEventChunk(StreamProcessor.java:41)
at io.siddhi.core.query.processor.stream.AbstractStreamProcessor.process(AbstractStreamProcessor.java:132)
at io.siddhi.core.query.input.ProcessStreamReceiver.processAndClear(ProcessStreamReceiver.java:183)
at io.siddhi.core.query.input.ProcessStreamReceiver.process(ProcessStreamReceiver.java:90)
at io.siddhi.core.query.input.ProcessStreamReceiver.receive(ProcessStreamReceiver.java:128)
at io.siddhi.core.stream.StreamJunction.sendEvent(StreamJunction.java:199)
at io.siddhi.core.stream.StreamJunction$Publisher.send(StreamJunction.java:474)
at io.siddhi.core.stream.input.InputDistributor.send(InputDistributor.java:34)
at io.siddhi.core.stream.input.InputEntryValve.send(InputEntryValve.java:45)
at io.siddhi.core.stream.input.InputHandler.send(InputHandler.java:73)
at io.siddhi.distribution.editor.core.internal.DebuggerEventStreamService.pushEvent(DebuggerEventStreamService.java:74)
at io.siddhi.distribution.event.simulator.core.internal.generator.SingleEventGenerator.sendEvent(SingleEventGenerator.java:85)
at io.siddhi.distribution.event.simulator.core.impl.SingleApiServiceImpl.runSingleSimulation(SingleApiServiceImpl.java:20)
at io.siddhi.distribution.event.simulator.core.api.SingleApi.runSingleSimulation(SingleApi.java:72)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.wso2.msf4j.internal.router.HttpMethodInfo.invokeResource(HttpMethodInfo.java:187)
at org.wso2.msf4j.internal.router.HttpMethodInfo.invoke(HttpMethodInfo.java:143)
at org.wso2.msf4j.internal.MSF4JHttpConnectorListener.dispatchMethod(MSF4JHttpConnectorListener.java:218)
at org.wso2.msf4j.internal.MSF4JHttpConnectorListener.lambda$onMessage$58(MSF4JHttpConnectorListener.java:129)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
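The NullPointerException above comes from the extension using a connection that was never established. A minimal, hypothetical sketch of failing fast with a descriptive message instead (the method name and exception choice are assumptions, not the actual extension code):

```java
import java.sql.Connection;

public class QueryGuard {
    // Hypothetical guard: surface a clear error instead of an NPE when the
    // configured datasource (e.g. 'Customers') could not be connected.
    public static Connection requireConnection(Connection conn, String datasourceName) {
        if (conn == null) {
            throw new IllegalStateException("Datasource '" + datasourceName
                    + "' is not connected; check the datasource name and database availability.");
        }
        return conn;
    }
}
```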
Description:
SiddhiStoreAPITestcase.testSelectAllWithSuccessResponse() fails intermittently because the test Siddhi app fails to deploy with a 'No extension exist for source:tcp' exception.
Affected Product Version:
master branch
OS, DB, other environment details and versions:
Appveyor
Steps to reproduce:
When AppVeyor runs the build for a PR.
Description:
When exporting a Siddhi app into K8s artifacts it gives the following error.
[2019-09-09 13:08:52,925] ERROR {io.siddhi.distribution.editor.core.internal.EditorMicroservice} - Cannot generate export-artifacts archive. java.lang.IllegalArgumentException: Illegal group reference
at java.util.regex.Matcher.appendReplacement(Matcher.java:857)
at java.util.regex.Matcher.replaceAll(Matcher.java:955)
at java.lang.String.replaceAll(String.java:2223)
at io.siddhi.distribution.editor.core.internal.ExportUtils.getKubernetesFile(ExportUtils.java:431)
at io.siddhi.distribution.editor.core.internal.ExportUtils.createZipFile(ExportUtils.java:246)
at io.siddhi.distribution.editor.core.internal.EditorMicroservice.exportApps(EditorMicroservice.java:1113)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.wso2.msf4j.internal.router.HttpMethodInfo.invokeResource(HttpMethodInfo.java:187)
at org.wso2.msf4j.internal.router.HttpMethodInfo.invoke(HttpMethodInfo.java:143)
at org.wso2.msf4j.internal.MSF4JHttpConnectorListener.dispatchMethod(MSF4JHttpConnectorListener.java:218)
at org.wso2.msf4j.internal.MSF4JHttpConnectorListener.lambda$onMessage$58(MSF4JHttpConnectorListener.java:129)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
The reason for this error was an HTTP service spec like the one below.
-- HTTP source
@source(
type='http-service',
source.id='adder',
receiver.url='http://0.0.0.0:8088/comboSuperMart/promo',
basic.auth.enabled='',
@map(type='json', @attributes(messageId='trp:messageId', promoCardId='$.event.promoCardId', amount='$.event.amount'))
)
The special character '$' in those mappings is the reason for this issue, and we have to escape it before performing the replacements.
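Java's String.replaceAll treats '$' in the replacement string as a group reference, which is what produces the "Illegal group reference" above. A minimal sketch of the fix using java.util.regex.Matcher.quoteReplacement; the {{APP}} template placeholder here is a hypothetical illustration, not the actual template used by the export utility.

```java
import java.util.regex.Matcher;

public class TemplateFill {
    // Substitute the app body into a template, escaping '$' and '\' in the
    // replacement so replaceAll treats them as literal characters.
    public static String fill(String template, String appBody) {
        return template.replaceAll("\\{\\{APP\\}\\}", Matcher.quoteReplacement(appBody));
    }

    public static void main(String[] args) {
        String body = "@map(type='json', @attributes(amount='$.event.amount'))";
        // Without quoteReplacement, this replaceAll would throw
        // IllegalArgumentException: Illegal group reference (because of '$.').
        System.out.println(fill("siddhiApps: {{APP}}", body));
    }
}
```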
Suggested Labels:
Bug report
Affected Product Version:
5.1.0-alpha
Steps to reproduce:
Export the following sample app with HTTP service spec from Siddhi tooling editor.
-- HTTP source
@source(
type='http-service',
source.id='adder',
receiver.url='http://0.0.0.0:8088/comboSuperMart/promo',
basic.auth.enabled='',
@map(type='json', @attributes(messageId='trp:messageId', promoCardId='$.event.promoCardId', amount='$.event.amount'))
)
Description:
In the process of exporting K8s artifacts, if the user did not specify name: sp-name in the last step, the name in the resulting YAML appeared as below.
metadata:
  name: {{SIDDHI_PROCESS_NAME}}
When we try to install that YAML using kubectl, it gives an error, because K8s only accepts names consisting of lowercase alphanumeric characters, '-', and '.'.
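For reference, K8s object names must be valid DNS-1123 subdomains. A small sketch of validating a name before writing the YAML, plus a hypothetical fallback that derives a usable default from the app name (the sanitize helper is an illustration, not the actual tooling code):

```java
import java.util.Locale;

public class K8sNames {
    // DNS-1123 subdomain: lowercase alphanumeric labels that may contain '-',
    // separated by '.', each label starting and ending with an alphanumeric.
    private static final String DNS_1123_SUBDOMAIN =
            "[a-z0-9]([-a-z0-9]*[a-z0-9])?(\\.[a-z0-9]([-a-z0-9]*[a-z0-9])?)*";

    public static boolean isValid(String name) {
        return name.length() <= 253 && name.matches(DNS_1123_SUBDOMAIN);
    }

    // Hypothetical fallback: derive a usable default from the Siddhi app name
    // instead of emitting the raw {{SIDDHI_PROCESS_NAME}} placeholder.
    public static String sanitize(String appName) {
        String s = appName.toLowerCase(Locale.ROOT).replaceAll("[^a-z0-9.-]", "-");
        s = s.replaceAll("^[^a-z0-9]+", "").replaceAll("[^a-z0-9]+$", "");
        return s.isEmpty() ? "siddhi-process" : s;
    }
}
```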
Suggested Labels:
Bug fix
Affected Product Version:
5.1.0-beta
OS, DB, other environment details and versions:
Java version "1.8.0_201"
Java(TM) SE Runtime Environment (build 1.8.0_201-b09)
Steps to reproduce:
Export Siddhi file without specifying the name.
Description:
Redundant logic used to verify server-type-based deployments must be removed.
Affected Product Version:
5.0.0 - 5.1.0
Description:
$subject, as it is only supported with databridge transports. Either MSF4J needs to support this configuration, or this configuration needs to be deprecated.
Affected Product Version:
v5.1.0-m2
Description:
I ran the Siddhi runner alpha version on Java 11 and it gave the following error.
Sep 03, 2019 4:40:53 AM org.wso2.carbon.tools.InstallJarsTool backupLibDirectory
INFO: Backed up lib to /home/siddhi_user/siddhi-runner/_lib
Sep 03, 2019 4:40:53 AM org.wso2.carbon.tools.converter.utils.BundleGeneratorUtils convertFromJarToBundle
WARNING: Path /home/siddhi_user/siddhi-runner/jars/java-nats-streaming-2.1.2.jar refers to an OSGi bundle
Sep 03, 2019 4:40:53 AM org.wso2.carbon.tools.converter.utils.BundleGeneratorUtils convertFromJarToBundle
INFO: Created the OSGi bundle java_nats_streaming_2.1.2_1.0.0.jar for JAR file /home/siddhi_user/siddhi-runner/jars/java-nats-streaming-2.1.2.jar
Sep 03, 2019 4:40:53 AM org.wso2.carbon.tools.converter.utils.BundleGeneratorUtils convertFromJarToBundle
WARNING: Path /home/siddhi_user/siddhi-runner/jars/jnats-2.3.0.jar refers to an OSGi bundle
Sep 03, 2019 4:40:53 AM org.wso2.carbon.tools.converter.utils.BundleGeneratorUtils convertFromJarToBundle
INFO: Created the OSGi bundle jnats_2.3.0_1.0.0.jar for JAR file /home/siddhi_user/siddhi-runner/jars/jnats-2.3.0.jar
Sep 03, 2019 4:40:53 AM org.wso2.carbon.tools.converter.utils.BundleGeneratorUtils convertFromJarToBundle
WARNING: Path /home/siddhi_user/siddhi-runner/jars/protobuf-java-3.6.1.jar refers to an OSGi bundle
Sep 03, 2019 4:40:53 AM org.wso2.carbon.tools.converter.utils.BundleGeneratorUtils convertFromJarToBundle
INFO: Created the OSGi bundle protobuf_java_3.6.1_1.0.0.jar for JAR file /home/siddhi_user/siddhi-runner/jars/protobuf-java-3.6.1.jar
Sep 03, 2019 4:40:53 AM org.wso2.carbon.tools.InstallJarsTool updateMetaFile
INFO: .meta successfully created on /home/siddhi_user/siddhi-runner/.meta
Starting WSO2 Carbon (in unsupported JDK)
[ERROR] CARBON is supported only on JDK 1.8
JAVA_HOME environment variable is set to /opt/java/openjdk
CARBON_HOME environment variable is set to /home/siddhi_user/siddhi-runner
RUNTIME_HOME environment variable is set to /home/siddhi_user/siddhi-runner/wso2/runner
Error: Could not create the Java Virtual Machine.
Error: A fatal exception has occurred. Program will exit.
-Djava.endorsed.dirs=/home/siddhi_user/siddhi-runner/bin/bootstrap/endorsed:/opt/java/openjdk/jre/lib/endorsed:/opt/java/openjdk/lib/endorsed is not supported. Endorsed standards and standalone APIs in modular form will be supported via the concept of upgradeable modules.
Here I built the Siddhi runner Docker image using the Java 11 Alpine (adoptopenjdk/openjdk11:x86_64-alpine-jre11u-nightly) image and ran it.
Suggested Labels:
Bug report
Suggested Assignees:
N/A
Affected Product Version:
Siddhi Distribution Release 5.1.0-alpha
OS, DB, other environment details and versions:
openjdk11
Steps to reproduce:
Related Issues:
N/A
Description:
The below error is given when starting the Siddhi tooling distribution on macOS.
Mohanadarshans-MacBook-Pro:bin mohan$ ./tooling.sh
JAVA_HOME environment variable is set to /Library/Java/JavaVirtualMachines/jdk1.8.0_212.jdk/Contents/Home
CARBON_HOME environment variable is set to /Users/mohan/source-code/mohan/distribution/tooling/target/siddhi-tooling-5.1.0-SNAPSHOT
RUNTIME_HOME environment variable is set to /Users/mohan/source-code/mohan/distribution/tooling/target/siddhi-tooling-5.1.0-SNAPSHOT/wso2/tooling
[2019-07-09 18:14:23,422] INFO {org.wso2.carbon.launcher.extensions.OSGiLibBundleDeployerUtils updateOSGiLib} - Successfully updated the OSGi bundle information of Carbon Runtime: tooling
osgi> org.osgi.framework.BundleException: Unable to acquire the state change lock for the module: osgi.identity; osgi.identity="org.ops4j.pax.logging.pax-logging-log4j2"; type="osgi.bundle"; version:Version="1.10.0" [id=5] STARTED [STARTED]
at org.eclipse.osgi.container.Module.lockStateChange(Module.java:337)
at org.eclipse.osgi.container.Module.start(Module.java:401)
at org.eclipse.osgi.internal.framework.EquinoxBundle.start(EquinoxBundle.java:383)
at org.eclipse.osgi.internal.framework.EquinoxBundle.start(EquinoxBundle.java:402)
at org.eclipse.equinox.internal.simpleconfigurator.ConfigApplier.startBundles(ConfigApplier.java:453)
at org.eclipse.equinox.internal.simpleconfigurator.ConfigApplier.install(ConfigApplier.java:111)
at org.eclipse.equinox.internal.simpleconfigurator.SimpleConfiguratorImpl.applyConfiguration(SimpleConfiguratorImpl.java:191)
at org.eclipse.equinox.internal.simpleconfigurator.SimpleConfiguratorImpl.applyConfiguration(SimpleConfiguratorImpl.java:205)
at org.eclipse.equinox.internal.simpleconfigurator.Activator.start(Activator.java:60)
at org.eclipse.osgi.internal.framework.BundleContextImpl$3.run(BundleContextImpl.java:774)
at org.eclipse.osgi.internal.framework.BundleContextImpl$3.run(BundleContextImpl.java:1)
at java.security.AccessController.doPrivileged(Native Method)
at org.eclipse.osgi.internal.framework.BundleContextImpl.startActivator(BundleContextImpl.java:767)
at org.eclipse.osgi.internal.framework.BundleContextImpl.start(BundleContextImpl.java:724)
at org.eclipse.osgi.internal.framework.EquinoxBundle.startWorker0(EquinoxBundle.java:932)
at org.eclipse.osgi.internal.framework.EquinoxBundle$EquinoxModule.startWorker(EquinoxBundle.java:309)
at org.eclipse.osgi.container.Module.doStart(Module.java:581)
at org.eclipse.osgi.container.Module.start(Module.java:449)
at org.eclipse.osgi.internal.framework.EquinoxBundle.start(EquinoxBundle.java:383)
at org.eclipse.osgi.internal.framework.EquinoxBundle.start(EquinoxBundle.java:402)
at org.wso2.carbon.launcher.CarbonServer.loadInitialBundles(CarbonServer.java:242)
at org.wso2.carbon.launcher.CarbonServer.start(CarbonServer.java:83)
at org.wso2.carbon.launcher.Main.main(Main.java:84)
Caused by: java.util.concurrent.TimeoutException: Timeout after waiting 5 seconds to acquire the lock.
at org.eclipse.osgi.container.Module.lockStateChange(Module.java:334)
... 22 more
[2019-07-09 18:14:49,939] INFO {org.wso2.msf4j.internal.websocket.WebSocketServerSC} - All required capabilities are available of WebSocket service component is available.
[2019-07-09 18:14:49,961] INFO {org.wso2.carbon.metrics.core.config.model.JmxReporterConfig} - Creating JMX reporter for Metrics with domain 'org.wso2.carbon.metrics'
[2019-07-09 18:14:49,980] INFO {org.wso2.msf4j.analytics.metrics.MetricsComponent} - Metrics Component is activated
[2019-07-09 18:14:49,982] INFO {org.wso2.carbon.databridge.agent.internal.DataAgentDS} - Successfully deployed Agent Server
[2019-07-09 18:14:50,089] INFO {io.siddhi.distribution.editor.core.internal.WorkspaceDeployer} - Workspace artifact deployer initiated.
[2019-07-09 18:14:50,093] INFO {io.siddhi.distribution.event.simulator.core.service.CSVFileDeployer} - CSV file deployer initiated.
[2019-07-09 18:14:50,095] INFO {io.siddhi.distribution.event.simulator.core.service.SimulationConfigDeployer} - Simulation config deployer initiated.
osgi>
osgi> [2019-07-09 18:15:09,647] INFO {io.siddhi.distribution.editor.core.internal.StartupComponent} - Editor Started on : http://10.100.0.99:9390/editor
[2019-07-09 18:15:09,647] INFO {org.wso2.msf4j.internal.MicroservicesServerSC} - All microservices are available
osgi>
osgi> [2019-07-09 18:15:14,710] INFO {org.wso2.transport.http.netty.contractimpl.listener.ServerConnectorBootstrap$HttpServerConnector} - HTTP(S) Interface starting on host 0.0.0.0 and port 9390
[2019-07-09 18:15:14,710] INFO {org.wso2.transport.http.netty.contractimpl.listener.ServerConnectorBootstrap$HttpServerConnector} - HTTP(S) Interface starting on host 0.0.0.0 and port 9743
[2019-07-09 18:15:14,739] INFO {org.wso2.carbon.databridge.receiver.binary.internal.BinaryDataReceiverServiceComponent} - org.wso2.carbon.databridge.receiver.binary.internal.Service Component is activated
[2019-07-09 18:15:14,742] INFO {org.wso2.carbon.databridge.receiver.thrift.internal.ThriftDataReceiverDS} - Service Component is activated
[2019-07-09 18:15:14,745] INFO {org.wso2.carbon.kernel.internal.CarbonStartupHandler} - Siddhi Tooling Distribution started in 56.678 sec
Description:
Currently, Siddhi tooling supports automatic export of Siddhi apps to Docker and K8s. For the K8s export, the current export facility makes the user go through the following process.
1. Build the Docker image: docker build -t <IMAGE> .
2. Update the spec.container.image spec in the siddhi-process.yaml file.
3. Deploy the siddhi-process.yaml using kubectl.
This is quite a long process for deploying the exported Siddhi apps.
If Siddhi tooling could build the Docker image by itself and push it to a user-given Docker registry account, we could reduce this so that the user only needs to deploy the siddhi-process.yaml using kubectl.
Suggested Labels:
Type improvement
Affected Product Version:
5.1.0-beta
Related Issues:
#247
Description:
When we use relative paths as values for the -Dapps/-Dconfig system arguments while starting Siddhi Runner, the corresponding functionality fails.
Affected Product Version:
Siddhi Runner 5.1.0-m2
Steps to reproduce:
./runner.sh -Dapps=../wso2/SampleApp.siddhi
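One likely cause (an assumption; the actual launcher code may differ) is that relative values are resolved against the process working directory rather than a fixed base. A sketch of resolving a possibly-relative -Dapps value against the runner home instead:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class PathResolve {
    // Resolve a possibly-relative -Dapps/-Dconfig value against a fixed base
    // directory (e.g. the runner home) instead of the current working directory.
    public static Path resolve(String value, Path baseDir) {
        Path p = Paths.get(value);
        return (p.isAbsolute() ? p : baseDir.resolve(p)).normalize();
    }
}
```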
Description:
The following error is thrown on a hard refresh of the browser.
osgi> [2019-03-16 21:16:28,026] ERROR {org.wso2.transport.http.netty.common.Util} - Remote client closed the connection before completing outbound response java.io.IOException: Remote client closed the connection before completing outbound response
at org.wso2.transport.http.netty.common.Util.lambda$checkForResponseWriteStatus$9(Util.java:599)
at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:481)
at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:420)
at io.netty.util.concurrent.DefaultPromise.addListener(DefaultPromise.java:163)
at io.netty.channel.DefaultChannelPromise.addListener(DefaultChannelPromise.java:93)
at io.netty.channel.DefaultChannelPromise.addListener(DefaultChannelPromise.java:28)
at org.wso2.transport.http.netty.common.Util.checkForResponseWriteStatus(Util.java:595)
at org.wso2.transport.http.netty.contractimpl.HttpOutboundRespListener.writeOutboundResponseHeaderAndBody(HttpOutboundRespListener.java:187)
at org.wso2.transport.http.netty.contractimpl.HttpOutboundRespListener.writeOutboundResponse(HttpOutboundRespListener.java:138)
at org.wso2.transport.http.netty.contractimpl.HttpOutboundRespListener.lambda$null$35(HttpOutboundRespListener.java:94)
at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
Related Issues:
wso2/product-sp#792
Description:
With the Siddhi Tooling distribution v5.1.0-m2, we bundle only 9 mandatory samples. It would be easier for the user if we moved the extension-specific samples out of samples/artifacts.
Before moving them, we would have to clean up the descriptions.
Affected Product Version:
v5.1.0-m2
Description:
The below test cases are failing on Windows due to file-specific operations. We should fix this ASAP.
[00:47:38] testDeleteFilesApi(io.siddhi.distribution.test.osgi.SimulatorAPITestcase) Time elapsed: 0.044 sec <<< FAILURE!
[00:47:38] java.lang.AssertionError: expected [200] but found [500]
[00:47:38] at org.testng.Assert.fail(Assert.java:94)
[00:47:38] at org.testng.Assert.failNotEquals(Assert.java:496)
[00:47:38] at org.testng.Assert.assertEquals(Assert.java:125)
[00:47:38] at org.testng.Assert.assertEquals(Assert.java:372)
[00:47:38] at org.testng.Assert.assertEquals(Assert.java:382)
[00:47:38] at io.siddhi.distribution.test.osgi.SimulatorAPITestcase.testDeleteFilesApi(SimulatorAPITestcase.java:303)
[00:47:38]
[00:47:38] testListingDirectories(io.siddhi.distribution.test.osgi.SiddhiEditorTestCase) Time elapsed: 0.254 sec <<< FAILURE!
[00:47:38] java.lang.AssertionError: expected [200] but found [500]
[00:47:38] at org.testng.Assert.fail(Assert.java:94)
[00:47:38] at org.testng.Assert.failNotEquals(Assert.java:496)
[00:47:38] at org.testng.Assert.assertEquals(Assert.java:125)
[00:47:38] at org.testng.Assert.assertEquals(Assert.java:372)
[00:47:38] at org.testng.Assert.assertEquals(Assert.java:382)
[00:47:38] at io.siddhi.distribution.test.osgi.SiddhiEditorTestCase.testListingDirectories(SiddhiEditorTestCase.java:139)
[00:47:38]
[00:47:39]
[00:47:39] Results :
[00:47:39]
[00:47:39] Failed tests:
[00:47:39] SiddhiEditorTestCase.testListingDirectories:139 expected [200] but found [500]
[00:47:39] SimulatorAPITestcase.testDeleteFilesApi:303 expected [200] but found [500]
Description:
The zip file created by the K8s export feature has the same name as the file exported through the Docker export feature.
Affected Product Version:
siddhi-tooling-5.1.0-alpha
Steps to reproduce:
Assignees
@BuddhiWathsala
Description:
Support parameter overloading in editor source view
Affected Product Version:
5.1.0-m1
Related PR:
#188
Description:
$subject...
It uses OSGi, and I am not familiar with how to set up a development environment for it.
Description:
When a Siddhi app is passed through a Siddhi Custom Resource object for a Kubernetes deployment, we use the Siddhi Parser to extract the information the Siddhi Operator needs to create the deployment.
While parsing the Siddhi app, the parser starts a SiddhiAppRuntime, which causes it to create network connections. If any network connection fails, the parsing fails.
We cannot use createSandboxSiddhiAppRuntime(), since it removes all network endpoints (sources, sinks, ...), which would make the parsing incomplete.
Affected Product Version:
5.1.0-alpha
Description:
Document available endpoints, syntax, sample request and sample response in Siddhi Runner and Tooling.
Description:
$subject
The context for the documentation
Description:
We need to migrate the mandatory samples that demonstrate the capabilities of Siddhi from WSO2 SP-Samples to the distribution.
Description:
The View/Hide File Explorer functionality and the export-to-Docker functionality have the same keyboard shortcut.
Affected Product Version:
siddhi-tooling-5.1.0-alpha
Description:
$subject. Links point to the SP docs and wso2.github.io.
The icons and the page name are still WSO2 Stream Processor.
Description:
We need proper documentation (an article) explaining the end-to-end CI/CD story of Siddhi. This document will be the guide for external users to achieve the complete CI/CD cycle with Siddhi.
Description:
Siddhi Editor needs to be updated with Siddhi logo.
Description:
AFAIK there is no proper mechanism to shut down the Siddhi runner gracefully. I deployed the Siddhi runner in a K8s cluster, with a requirement of zero downtime for the deployment. I deployed a Siddhi app with an HTTP source and routed traffic using the NGINX ingress controller.
Sometimes a Siddhi runner terminates while another Siddhi runner deployment starts consuming the HTTP requests. The terminating runner still has established HTTP connections, but unfortunately it shuts down immediately without responding on those connections, which leads to a 502 Bad Gateway on the client side.
On termination, the Siddhi runner receives a SIGTERM from K8s. At that point the runner should finish all running threads and only then terminate.
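A minimal sketch of the requested behavior (the names and timeout values are assumptions, not the actual runner code): the JVM runs shutdown hooks on SIGTERM, so the runner could stop accepting new work there and drain in-flight tasks before exiting, staying under K8s's terminationGracePeriodSeconds.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class GracefulShutdown {
    // Stop accepting new tasks, then wait for in-flight ones to finish.
    public static boolean drain(ExecutorService pool, long timeoutMs) throws InterruptedException {
        pool.shutdown(); // no new tasks accepted from here on
        return pool.awaitTermination(timeoutMs, TimeUnit.MILLISECONDS);
    }

    public static void main(String[] args) {
        ExecutorService workers = Executors.newFixedThreadPool(4);
        // K8s sends SIGTERM first and SIGKILL only after the grace period,
        // so the drain must finish within that window (30s by default).
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            try {
                drain(workers, 25_000);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }));
    }
}
```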
Suggested Labels:
type/fix
Affected Product Version:
v5.1.0-m1
Steps to reproduce:
Description:
My <SIDDHI-RUNNER-HOME>/wso2/worker/deployment/siddhi-files/ directory contains some symbolic-link sub-directories and files, such as
drwxr-xr-x 2 root root 4096 Mar 18 10:06 ..2019_03_18_10_06_42.481700548
lrwxrwxrwx 1 root root 31 Mar 18 10:06 ..data -> ..2019_03_18_10_06_42.481700548
apart from my *.siddhi files. When I try to deploy Siddhi files in the <SIDDHI-RUNNER-HOME>/wso2/worker/deployment/siddhi-files directory, it tries to deploy those symbolic-link sub-directories and files as well, and then gives an error such as:
[2019-03-18 10:06:50,993] ERROR {io.siddhi.distribution.core.core.internal.StreamProcessorDeployer} - /home/siddhi-runner-1.0.0-SNAPSHOT/wso2/worker/deployment/siddhi-files/..data (Is a directory) java.io.FileNotFoundException: /home/siddhi-runner-1.0.0-SNAPSHOT/wso2/worker/deployment/siddhi-files/..data (Is a directory)
So, it would be better to filter only the .siddhi files from the <SIDDHI-RUNNER-HOME>/wso2/worker/deployment/siddhi-files directory and deploy them, instead of deploying all the files in that directory.
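A sketch of the requested filtering; the deployer integration is assumed, only the filter logic is shown. The '..data' entries in the listing above are the symlink layout that K8s ConfigMap volume mounts create, so skipping everything that is not a regular *.siddhi file avoids the error:

```java
import java.io.File;

public class SiddhiFileFilter {
    // Accept only regular files with the .siddhi extension; skip directories
    // and the '..data' / '..<timestamp>' entries created by ConfigMap mounts.
    public static boolean isDeployable(File f) {
        return f.isFile() && f.getName().endsWith(".siddhi");
    }
}
```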
Description:
Need to migrate the editor fixes done in carbon-analytics to the distribution.
Affected Product Version:
5.1.0-m1
Description:
Due to the improvements related to CI/CD of Siddhi, there are some improvements/changes required in the Siddhi editor. Hence, this issue is created to track that.
Description:
A "No results found" error appears for distributionStrategy under Operators in the Siddhi Editor.
If there is no content for the said operator, we should not show it under Operators.
Description:
$subject, as discussed in https://groups.google.com/forum/?hl=en#!topic/siddhi-dev/j3W4mA8jjQ8
This is to track the namespaces task list for aligning the components with the new configuration structure.
Description:
Transport configurations should be documented, since there is currently no clear documentation on them.
Affected Product Version:
v5.1
Description:
Remove the unwanted data bridge feature. Data bridge will no longer be used in the runner and tooling, and most of the data bridge dependencies are removed from the distribution.
That removal will make the final packs lighter.
Affected Product Version:
5.1.0-m2