Comments (14)

tjanczuk commented on September 28, 2024

The use of the JSON serializer to help marshal data from .NET to node.js is temporary. Going forward it will be removed completely in favor of direct type marshaling between the CLR and V8, similar to how marshaling from node.js to the CLR is implemented today. Given that, exposing such an abstraction at this point is not desirable.
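
For context, this is the kind of call the marshaling applies to. A minimal sketch using the edge.func API; the object shape and property names are purely illustrative:

```js
var edge = require('edge');

// A .NET lambda compiled by edge; the anonymous object it returns is
// marshaled from the CLR into a plain JavaScript object on the V8 side.
var getOrder = edge.func(function () {/*
    async (input) => {
        return new {
            id = 1,
            customer = "ACME",
            total = 42.5
        };
    }
*/});

getOrder(null, function (error, result) {
    if (error) throw error;
    console.log(result); // { id: 1, customer: 'ACME', total: 42.5 }
});
```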

mythz commented on September 28, 2024

ok cool sounds good.

mythz commented on September 28, 2024

Actually I'm re-opening this so you can let us know when you've removed the JSON Serializer in favor of the direct type marshaling proposed.

Any guess of an ETA on this?

tjanczuk commented on September 28, 2024

Do you have a specific problem with the use of JavaScriptSerializer? You mentioned performance before; is that the motivation? I think I'd rather have some perf benchmarks first to be able to quantify the effect of any changes in this space. Perf benchmarks are covered by #27.

Also, pull requests welcome ;)

mythz commented on September 28, 2024

Yes, the performance of JSS sucks; I've had to pull it out of being measured in the Northwind Benchmarks because it was infeasible to wait for it to complete runs with any high value of N.

Basically I want to marshal the entire request and response through to a ServiceStack back-end to handle, and I'd prefer to start work once the final solution is in place rather than before it gets changed again.

I think you're the best person to implement the ideal marshaling, as you know the optimal way to do it better than anyone else.

tjanczuk commented on September 28, 2024

Can you contribute some benchmarks based on your payload profile so that I can have something real to measure against?

mythz commented on September 28, 2024

Benchmarks with different JSON Serializers in #edgejs? Are the JSON Serializers swappable atm?

tjanczuk commented on September 28, 2024

I mean as part of #27, can you add a measurement of how long it takes to marshal your sample payload from JS to .NET (or back)? Run it in a loop 10,000 times and average the time, that kind of thing. Once we have that benchmark we will be able to assess the effect of the changes I make in the code on your scenario.
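
A minimal sketch of that kind of measurement, assuming edge is installed; the payload is a placeholder for whatever your real request/response objects look like, and the echo lambda is chosen so the cost measured is dominated by marshaling rather than application logic:

```js
var edge = require('edge');

// Placeholder payload; substitute your own ServiceStack-style request object.
var payload = { id: 123, name: 'test', items: [1, 2, 3] };

// A .NET lambda that just echoes its input, so each call exercises
// JS -> CLR marshaling on the way in and CLR -> JS marshaling on the way out.
var echo = edge.func(function () {/*
    async (input) => { return input; }
*/});

var iterations = 10000;
var start = Date.now();

function run(i) {
    if (i === iterations) {
        var elapsed = Date.now() - start;
        console.log('average latency per call (ms):', elapsed / iterations);
        return;
    }
    echo(payload, function (error) {
        if (error) throw error;
        run(i + 1);
    });
}

run(0);
```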

mythz commented on September 28, 2024

okie cool, I'll see what I can stitch together.

tjanczuk commented on September 28, 2024

I've started some work in the perf branch. Here is the first test: https://github.com/tjanczuk/edge/blob/perf/performance/marshal_clr2v8.js. It measures the latency of returning a simple object from .NET to node.js and also provides a baseline of doing the same purely in node.js. The baseline is there not because it is a scenario to compare against (a proper baseline would be a cross-process call rather than an in-process call from node.js to the CLR), but because having it lets us assess latency in relative terms and compare performance results across machines in a meaningful way.

So in my case it appears calling .NET from node.js is currently about 23x slower than calling node.js from node.js.

C:\projects\edge\performance>node marshal_clr2v8.js clr2v8
{ rss: 44965888,
  heapTotal: 12504832,
  heapUsed: 3033500,
  latency: 0.09741 }

C:\projects\edge\performance>node marshal_clr2v8.js baseline
{ rss: 14454784,
  heapTotal: 12504832,
  heapUsed: 1508136,
  latency: 0.00429 }

Now, let's get to work.

mythz commented on September 28, 2024

brilliant, thx for the info, i'll try to put some benchmarks together on the weekend as well.

tjanczuk commented on September 28, 2024

With b857ca2 the dependency on JSON serialization for marshaling from CLR to V8 is removed. Out of the gate this gives a 25% improvement.

C:\projects\edge\performance>node marshal_clr2v8.js clr2v8
{ rss: 41701376,
  heapTotal: 12504832,
  heapUsed: 2389464,
  latency: 0.07374 }

Time to break out the profiler.

tjanczuk commented on September 28, 2024

Here is one piece of low-hanging fruit:

(profiler screenshot: perf1)

Making support for ScriptIgnoreAttribute optional (set EDGE_ENABLE_SCRIPTSUPPORTATTRIBUTE=1 if you want it) improves perf by 47%, and 60% cumulative compared to the starting point. Fix in 1e81d94.

C:\projects\edge\performance>node marshal_clr2v8.js clr2v8
{ rss: 41496576,
  heapTotal: 12504832,
  heapUsed: 2367296,
  latency: 0.03906 }
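
As a sketch of what opting back in looks like (the environment variable name is taken verbatim from the comment above; the class and property names are illustrative, and the //#r reference is an assumption about where ScriptIgnoreAttribute lives):

```js
// In the shell, before starting node:
//   set EDGE_ENABLE_SCRIPTSUPPORTATTRIBUTE=1

var edge = require('edge');

var getUser = edge.func(function () {/*
    //#r "System.Web.Extensions.dll"

    using System.Threading.Tasks;
    using System.Web.Script.Serialization;

    public class User
    {
        public string Name { get; set; }

        [ScriptIgnore]   // with the flag set, this property is skipped during CLR -> V8 marshaling
        public string PasswordHash { get; set; }
    }

    public class Startup
    {
        public async Task<object> Invoke(object input)
        {
            return new User { Name = "alice", PasswordHash = "..." };
        }
    }
*/});

getUser(null, function (error, result) {
    if (error) throw error;
    console.log(result); // expect only { Name: 'alice' } when the attribute is honored
});
```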

tjanczuk commented on September 28, 2024

I am going to close this issue and open a new one with the targeted investigation that remains.

After all these changes, marshaling from CLR to V8 takes 36% of the process time, inclusive. There is no obvious low-hanging fruit any more, except marshaling from byte[] to Buffer, which currently allocates memory to copy the data. I will open a new issue to investigate whether this copy can be avoided; it would shave off another ~17% of the time.

(profiler screenshot: perf3)

(profiler screenshot: perf2)
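
For reference, this is the byte[] to Buffer path being discussed; a minimal sketch of a .NET function whose byte[] result is marshaled into a node.js Buffer (with the copy mentioned above happening inside that marshaling step):

```js
var edge = require('edge');

// A .NET lambda returning a byte[]; edge marshals it into a node.js Buffer.
var getBytes = edge.func(function () {/*
    async (input) => {
        return new byte[] { 1, 2, 3, 4, 5 };
    }
*/});

getBytes(null, function (error, result) {
    if (error) throw error;
    console.log(Buffer.isBuffer(result)); // true
    console.log(result);                  // <Buffer 01 02 03 04 05>
});
```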
