
sdk-javascript's Issues

Recommend dropping support for 0.2

In the SDK requirements documentation, it says "Each SDK supports the latest(N), and N-1, major releases of the CloudEvent spec", with a special case being 1.0 as latest with 0.3 recommended support.

Ultimately, dropping support for 0.2 should make maintenance simpler. It's hard to tease out from npm download numbers how many users are still on 0.2, but this seems like a reasonable move.

I also wonder what the versioning strategy is. Is the npm module versioned independently of the spec version? It seems like it must be, at least at the minor and patch levels.

Discussion: Version Branches

Currently the repository is tagged for each prerelease, mirroring the spec versions as they progressed. However, there are no release branches. This was probably fine prior to 1.0. But now that 1.0 is out, a v1.x branch should be created and maintained so that fixes can be backported to previous versions as we continue to move forward with changes in the repository.

I suggest the following:

  • Create a v1.x branch off of the v1.0.0 tag.
  • Create a v1.x-backport label which can be used to label any PRs that have landed since the 1.0.0 release, indicating they should be considered for backporting to the v1.x branch.
  • As appropriate, create patch releases off of the v1.x branch, tagging the repo on the v1.x branch with v1.0.1, for example, for the next patch release.
  • As appropriate, create minor releases off of the v1.x branch, tagging the repo on the v1.x branch with v1.1.0, for example, for the next minor release.
  • If needed create a v1.1.x branch for minor release patch updates.
  • As appropriate, create patch releases off of the v1.1.x branch, tagging that branch with v1.1.1, for example, for the next patch release in this major/minor combination.
  • Repeat for 2.0.0 and up on their own branches.
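The branch-and-tag steps above can be sketched with plain git commands. This is a minimal sketch run in a throwaway repository; the committer identity is a placeholder, and the branch and tag names follow the proposal:

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "maintainer@example.com"   # placeholder identity
git config user.name "Maintainer"

git commit -q --allow-empty -m "1.0.0 release"
git tag v1.0.0

# Create the v1.x release branch off of the v1.0.0 tag
git branch v1.x v1.0.0

# Backport work happens on v1.x; patch releases are tagged from there
git checkout -q v1.x
git commit -q --allow-empty -m "backported fix"
git tag v1.0.1
```

The same pattern repeats for v1.1.x and for 2.0.0 and up.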

This is related to the question raised in #72 where the contributor's guide talks about using the develop branch. That hasn't been done since the 1.0.0 release. We need to decide if we want to move forward with this process. If so, then the develop branch needs to be updated with all recent commits, and set as the primary branch in the repository settings. This setting will ensure that by default, pull requests from contributors are created to land on the develop branch.

My personal feeling is that the develop branch is mostly redundant with master and we don't really need it. If contributors can regularly rebase their pull requests on master, this shouldn't be a problem, though I think it does add some pressure to land PRs quickly. What are others' thoughts on this?

Merge dev into master

It's hard to contribute to this library when the dev branch is 50 commits behind master.

Please merge these commits into master so that we can contribute. As far as I know, there's no strong reason to keep the branches separate.

Recommend using GPG signed commits vs. --signoff

The --signoff flag for commits is not often used and is primarily a declaration that the committer authored the code, or has the rights to submit code she has not written. Instead, I think it's better to use cryptographically signed commits, which GitHub can also enforce (see protected branches in settings). A GPG signature verifies cryptographically that a commit comes from the person it appears to come from, whereas --signoff is just a line of text at the bottom of a commit message. Signed commits therefore give much more certainty that a commit is legitimate.
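As a hedged sketch, the repository-local settings involved would look something like this. The key id is a placeholder, and an actual GPG key must already exist on the machine for signing to work:

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q

# Sign every commit automatically instead of relying on --signoff
git config commit.gpgsign true
# Placeholder key id; use the id of your own GPG key
git config user.signingkey 0xDEADBEEF
```

Enforcement then happens server-side via GitHub's "Require signed commits" branch protection rule.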

Travis is Flaky

Travis looks like it is flaky. I have added some PRs that don't affect the tests, yet Travis seems to fail.

Example:
#74

Ideally a PR like this shouldn't fail.

A linter should be added to the package.json scripts

Because there are no linting rules outlined in the repository, and no task in package.json to lint the code, contributors can't know what the style guidelines/expectations are, and might therefore submit a pull request that will fail the linting checks on Codacy.
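A minimal sketch of what that could look like in package.json, assuming ESLint; the script names, paths, and version are illustrative, not the repository's actual layout:

```json
{
  "scripts": {
    "lint": "eslint lib/ test/",
    "pretest": "npm run lint",
    "test": "mocha test/"
  },
  "devDependencies": {
    "eslint": "^6.0.0"
  }
}
```

With a pretest hook like this, contributors would see lint failures locally before CI or Codacy does.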

Remove Unused .d.ts Types

The TypeScript definitions at v1/index.d.ts are not being used and cannot be used as-is (at least in a normal way). They are not exported in package.json.

They're likely out of date and not going to be aligned with the source code as things change.

I suggest removing them and auto-generating them at a later point in time.

getting error using httpsAgent in http binding emitter

We need to use mTLS authentication against an HTTPS endpoint, with a certificate and key, when submitting CloudEvents.
We got the following error:
"The "options.agent" property must be one of type Agent-like Object, undefined, or false."
when using the following code:
var options = {
  method: "POST",
  url: requestUrl,
  httpsAgent: new https.Agent({
    cert: fs.readFileSync(cf_cert),
    key: fs.readFileSync(cf_key)
  })
};

var binding = new cloudeventsSDK.StructuredHTTPEmitter(options);
binding.emit(ce)
  .then(response => { callback(null); })
  .catch(err => { callback(err); });

Lines 6 and 23 of emitter_structured.js (and possibly other emitters as well) cause the issue: the code assumes the configuration is a plain JSON object, but in the case of httpsAgent the underlying module requires the attribute to be an 'Agent-like' object.


Don't Push Directly to Master

I don't think we should be allowed to push directly to master. It seems like we should require at least one cross-company reviewer.

I wanted to create a PR to convert the library to TypeScript, but it was changing too often.
See #9
CC @n3wscott

Eliminate up-front spec version awareness in the receiver

Currently, when receiving CloudEvents you have to know what spec version the event is being sent as. But that shouldn't be required. As an event receiver, I should be able to accept events from any spec version supported by the module without prior knowledge of the sender's spec version.
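As a hedged sketch of what the receiver could do instead (the function is hypothetical, not this SDK's API; the header and field names come from the spec's HTTP bindings), the version can be sniffed from the incoming message itself:

```javascript
// Binary mode carries the version in the ce-specversion header;
// structured mode carries it inside the JSON envelope.
function detectSpecVersion(headers, body) {
  if (headers && headers["ce-specversion"]) {
    return headers["ce-specversion"];
  }
  const envelope = typeof body === "string" ? JSON.parse(body) : body;
  return envelope && envelope.specversion;
}

console.log(detectSpecVersion({ "ce-specversion": "1.0" }, null)); // "1.0"
console.log(detectSpecVersion({}, '{"specversion":"0.3"}'));       // "0.3"
```

The receiver could then dispatch to the matching unmarshaller without the caller declaring a version up front.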

Unmarshalling structured cloudevent with subject fails

Providing a subject key in a structured payload falls into the extension flow. It is then rejected as being a reserved attribute name.

I have not opened a PR yet; I wanted to make sure this was not intended behavior. The failing test is shown below.

https://github.com/cloudevents/sdk-javascript/compare/master...jabrown85:jab/subject-unmarshaller?expand=1

227 passing (226ms)
  1 failing

  1) HTTP Transport Binding Unmarshaller for CloudEvents v0.3
       Structured
         Should accept event that follow the spec 0.3:
     Reserved attribute name: 'subject'

How to emit event to Knative broker?

I have a Knative service which receives a CloudEvent from a Knative source. After processing this event I want to emit another CloudEvent to my Knative Eventing broker. Unfortunately, I'm having a hard time achieving this:

const binding = new CloudEvent.bindings["http-binary0.2"]({
    method: "POST",
    url: "http://my-broker.default.svc.cluster.local/"
});

This configuration leads to communication timeouts. Any help on this is highly appreciated 🙂

Contributor guide is out of date or not enforced

The contributor's guide for this repository is out of date and should be updated, or the recommendations there should be enforced. For example, the develop branch is out of date, and we're not following the gitflow recommendation of pull requests submitted to develop. Personally, I don't mind master being the tip as long as fixes can be backported to version branches.

If using a develop branch is important for this project, then I think the develop branch should be set as the default branch in GitHub and GH Actions should be used to enforce these processes: https://help.github.com/en/github/administering-a-repository/defining-the-mergeability-of-pull-requests

There should also be an update to the section on managing the changelog when this lands: #68

File upload example

As I understand it, when uploading a file over HTTP, the event's data should contain a stream. So would a Node.js example look like this?

// http1
http.createServer((req, res) => {
  const event = { ...read event metadata from headers... };
  event.data = req; // attach data stream
  ...add logic...
  res.end({ ...respond with CE metadata... });
});
// http2
const server = http2.createServer();
server.on('stream', (stream, headers, flags) => {
  const event = { ...read event metadata from headers... };
  event.data = stream; // attach data stream
  ...add logic...
  stream.end({ ...respond with CE metadata... });
});

Is this correct?

Discussion: Codify How Pull Requests are Landed

Currently there are only a handful of folks in the @cloudevents/sdk-maintainers group who are actively participating in this repository. But as other contributors come on board, it might be good to codify how pull requests are landed.

For example, can a contributor with committer rights submit, approve and land their own PR all within a single day? A single hour? At the risk of overkill, here's a strawman proposal.

  • No pull request may land without passing all automated checks
  • All maintainers can land pull requests from outside contributors 24 hours after they have been submitted, given that it has one approval (that approval can be from the person landing the PR)
  • If a maintainer has submitted a pull request and it has approval from one other maintainer, it can be landed immediately
  • If a maintainer has submitted a pull request and it has not received approval from at least one other maintainer, it can be landed after 48 hours
  • If an outside contributor submits a pull request that is very minor - e.g. fixing a typo in a doc - the 24 hour cooling period doesn't apply
  • If a pull request has both approvals and requested changes, it can't be landed until those requested changes are resolved.

Thoughts?

Change CI config to only build master, version branches and pull requests

Currently CI is configured to build all branches. We should consider modifying the configuration so that only master and perhaps develop, plus version branches are built. In general, I think feature/fix branches should be on the contributor's fork of the repository to avoid this problem and to prevent branch proliferation on origin. But that's not always the case - and when a feature/fix branch on origin fails CI, the badge on the README.md shows as failing. This makes sense if version branches are failing, but not if ongoing work on a feature/fix branch is.

So... I guess I have two recommendations.

  1. Modify CI to only build master (maybe develop) and version branches
  2. Discourage feature/fix branches on origin
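A hypothetical .travis.yml fragment for recommendation 1 (Travis applies the branch whitelist to push builds; pull request builds are evaluated against their target branch, so PRs against master still run):

```yaml
branches:
  only:
    - master
    - develop
    - /^v\d+(\.\d+)?\.x$/   # version branches such as v1.x or v1.1.x
```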

Error when installing Express example

$ npm install
npm ERR! code 1
npm ERR! Command failed: git checkout v1
npm ERR! error: pathspec 'v1' did not match any file(s) known to git
npm ERR! 

I'm going to send a PR to fix this and align it with the approach used by the TypeScript example.

HTTP binding for binary format

The HTTP binding for binary-formatted data currently does not follow the cloudevents/spec:

  1. The event type, time, and schema URL should not be required
  2. The content-type should be application/json, not application/cloudevents+json

Additionally, the config object should be copied because creating multiple bindings with the same config will override the content-type.
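The copying point can be sketched like this (makeBindingConfig and the header value are illustrative, not the SDK's actual code):

```javascript
// Copy the caller's config so per-binding header changes
// never leak back into the shared object.
function makeBindingConfig(config) {
  const copy = { ...config, headers: { ...(config.headers || {}) } };
  copy.headers["content-type"] = "application/json";
  return copy;
}

const shared = { method: "POST", headers: {} };
const first = makeBindingConfig(shared);

console.log(first.headers["content-type"]);  // "application/json"
console.log(shared.headers["content-type"]); // undefined: original untouched
```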

Fixed in issue #14.

Special handling of the "data" attribute not supported

In the v0.3 spec for JSON format section about Special Handling of the "data" Attribute, it says:

First, if an implementation determines that the type of the data attribute is Binary or String, it MUST inspect the datacontenttype attribute to determine whether it is indicated that the data value contains JSON data.
If the datacontenttype value is either "application/json" or any media type with a structured +json suffix, the implementation MUST translate the data attribute value into a JSON value, and set the data attribute of the envelope JSON object to this JSON value.

If I'm reading the spec correctly, it looks like the SDK should serialize stringified JSON in the data attribute. In other words, I would expect this to work:

const v03 = require("cloudevents-sdk/v03");
const data = { "so": "scare" };
const ce = v03.event
  .id('4e4cfbd6-d542-4235-9e80-99d2df62abee')
  .source('com.example.whatever')
  .type('something')
  .dataContentType("application/json")
  .data(JSON.stringify(data)) // <-- stringified json data goes in
  .format();

const ceData = ce.getData();
console.log(ceData.so); // <-- this breaks, getData returned the stringified JSON, not an object.

I wrote a sample test (joshwlewis@19cc9f3), if it's more useful than my untested example. It fails with:

  CloudEvents Spec v0.3
    The Constraints check
      'data'
        1) should convert data with stringified json to a json object


  0 passing (8ms)
  1 failing

  1) CloudEvents Spec v0.3
       The Constraints check
         'data'
           should convert data with stringified json to a json object:
     AssertionError: expected '{"much":"wow"}' to equal { much: 'wow' }
      at Context.it (test/spec_0_3_tests.js:209:41)

Dependencies are outdated and contain vulnerabilities

There are a number of vulnerabilities identified when running a clean npm install. There are also a number of dependencies specified in package.json that are unused. And finally, a couple of the dependencies are no longer maintained.

See: #49

Don't use AJV. Use JSON.

This SDK uses a library called AJV for schema validation. However, it makes creating and using CloudEvent objects very complicated through a set of setters and getters.

It would be a lot simpler to create CloudEvent objects directly from a JSON object. Further field validation could be done externally through a separate function, rather than through an ajv.compile(schema) call and an external spec.

Add grant as collaborator

Hello, I'd like to become a collaborator on this repo, as I'm working heavily on events at Google.
I was asked to create an issue, so here it is!

Tests are not run for pull requests

The .travis.yml file is configured so that the code coverage results are only published when the commit is not a pull request. However, as written, tests are not run at all for pull requests.
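A hypothetical shape of the fix: run the test suite unconditionally, and gate only the coverage publish on the build not being a pull request (npm run coverage is a placeholder script name):

```yaml
script:
  - npm test
after_success:
  - if [ "$TRAVIS_PULL_REQUEST" = "false" ]; then npm run coverage; fi
```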

develop branch and npm package behind

There are 2 PRs that are already merged.

PR #10 Changed JSON schema validator's options to remove warning: "$ref: key…

  • is in the develop branch, not in master or released in npm registry

PR #8 Added missing 'source' getter on Spec02

  • is in master branch but not released on npm registry

Can these two fixes be promoted to the npm registry?

Also, trunk-based development on the master branch seems to be more productive and more widely adopted these days. Why not just have a single master branch?

Error handling response

I have a question regarding error handling of events. The standard dictates that you always return the event parameters (e.g. headers) on success, but what about on error? In your example you do not return them, but simply respond with status 415.

Also, how should a batch of events be handled when only one, or n, of them fail?

$ref: keywords ignored in schema at path "#"

Hi,
just requiring the sdk with a simple
const Cloudevent = require('cloudevents-sdk');

causes the following error to be shown in the console:

$ref: keywords ignored in schema at path "#"

Any suggestion or comment?

Invalid content-type validation for binary format

With the following binary message:

{ accept: 'application/json, text/plain, */*',
  'content-type': 'application/json; charset=utf-8',
  'ce-type': 'com.github.pull.create',
  'ce-specversion': '0.2',
  'ce-source': 'urn:event:from:myapi/resourse/123',
  'ce-id': '74500b1a-4f87-43d1-afa9-7fe1c6251da8',
  'user-agent': 'axios/0.18.0',
  'content-length': '15',
  host: 'localhost:3000',
  connection: 'close' }
{"dummy":"wow"}

We got:

{ message: 'invalid content type',
  errors: [ 'application/json; charset=utf-8' ] }
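A minimal sketch of a fix (the allowed list and function name are illustrative): compare only the media type, ignoring parameters such as charset:

```javascript
const ALLOWED = ["application/json", "application/cloudevents+json"];

// "application/json; charset=utf-8" -> "application/json"
function isAllowedContentType(header) {
  const mediaType = String(header).split(";")[0].trim().toLowerCase();
  return ALLOWED.includes(mediaType);
}

console.log(isAllowedContentType("application/json; charset=utf-8")); // true
console.log(isAllowedContentType("text/plain"));                      // false
```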

Question: To separate constants by meaning/group ?

For example, STRUCTURED_ATTRS_1 is used only inside received_structured_1.js, but requiring it pulls in all the other constants via require("./constants.js");.

Is it worth separating the constants into different files? Is it likely that the constants file will keep growing?

Use Direct Object Creation Pattern, Do Not Use Builder Pattern

The JavaScript SDK uses a Java-like builder pattern for creating events.

This is not ideal as it is not very JavaScript-like, and is unnecessarily wordy. As we are in JavaScript, we can use direct JSON objects for creating the CloudEvent payload.

Here is a sample taken from the README:

Actual

let myevent = event()
  .source('/source')
  .type('type')
  .dataContentType('text/plain')
  .dataschema('http://d.schema.com/my.json')
  .subject('cha.json')
  .data('my-data')
  .addExtension("my-ext", "0x600");

Expected

import {CloudEvent} from 'cloudevents/v1';
let myevent = new CloudEvent({
  source: '/source',
  type: 'type',
  dataContentType: 'text/plain',
  dataschema: 'http://d.schema.com/my.json',
  subject: 'cha.json',
  data: 'my-data',
  addExtension: ["my-ext", "0x600"]
});

Support browser usage

JavaScript is ubiquitous in the browser and the CloudEvents module should be easy to consume in that environment. A good example is the sockeye cloud event viewer. This little chunk of code would be much improved if the CloudEvent module could be used.

Immutable event builder

Implement an immutable event builder that builds event instances which cannot be changed after construction.

Depends on #28
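A minimal sketch of the idea, using Object.freeze; the class and method names are hypothetical, not the SDK's API:

```javascript
class EventBuilder {
  constructor() { this.attrs = {}; }
  set(name, value) { this.attrs[name] = value; return this; }
  // Return a frozen copy so the built event can never change,
  // even if the builder keeps being used afterward.
  build() { return Object.freeze({ ...this.attrs }); }
}

const event = new EventBuilder()
  .set("id", "abc-123")
  .set("type", "my.type")
  .build();

// event.id = "changed"; // ignored (throws a TypeError in strict mode)
console.log(Object.isFrozen(event)); // true
```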

A Cloudevent class per spec version

Today we have just one cloudevent.js that provides the interface to the Spec classes. This has been fine with vanilla JS: thanks to the flexibility of the language, we inject new methods as needed.

Now it's time to stabilize things. We are approaching v1.0 and some attribute names are becoming stable, especially the required ones: id, source, specversion and type.

We need:

  • a base class with the required attributes
  • a class that extends the base one and adds its own spec-specific attributes
  • every implementation MUST follow the immutable-object philosophy
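A hedged sketch of that shape; the class names are hypothetical:

```javascript
// Base class holds the attributes required by every spec version.
class BaseCloudEvent {
  constructor({ id, source, specversion, type }) {
    this.id = id;
    this.source = source;
    this.specversion = specversion;
    this.type = type;
  }
}

// A per-spec subclass adds its own attributes and freezes the
// instance, following the immutable-object requirement.
class CloudEventV1 extends BaseCloudEvent {
  constructor(attrs) {
    super({ ...attrs, specversion: "1.0" });
    this.datacontenttype = attrs.datacontenttype;
    Object.freeze(this);
  }
}

const ce = new CloudEventV1({ id: "1", source: "/src", type: "demo" });
console.log(ce.specversion);      // "1.0"
console.log(Object.isFrozen(ce)); // true
```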
