
Comments (6)

gireeshpunathil commented on September 27, 2024

The client expects the standard HTTP response header plus the message: HTTP/1.1 200 OK ... \r\n\r\n, and, when the server ends the response, the chunked-encoding terminator 0\r\n\r\n as well.
When the server writes the message and then ends the response (a two-step process), these two pieces are dispatched to the client socket separately.

On Linux, the client receives them together. On AIX, the first dispatch is received immediately, so there are two discrete receptions.

The client's connection.on('data') callback is therefore invoked once or twice, depending on how many pieces are received.

This is purely a timing issue: it depends on how much delay the client incurs while reading from the socket. If several messages have already been deposited into the stream by the time it reads, it receives them coalesced into a single message string.
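
For illustration, a client written along the following lines does not depend on how the kernel segments the response; this is a minimal sketch, not the original test, and the host, port, and request here are placeholders:

```js
// Minimal sketch (not the original test): buffer every 'data' chunk and
// inspect the payload only on 'end', so the check passes whether the kernel
// delivers the response in one read or two.
const net = require('net');

const connection = net.connect({ host: '127.0.0.1', port: 8000 }, () => {
  connection.write('GET / HTTP/1.1\r\nHost: localhost\r\nConnection: close\r\n\r\n');
});

let response = '';
connection.on('data', (chunk) => {
  // May fire once (chunks coalesced, as on Linux) or twice (discrete
  // receptions, as observed on AIX).
  response += chunk;
});

connection.on('end', () => {
  // Assert on the complete response only after all fragments have arrived.
  console.log('received %d bytes', response.length);
});
```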

I am able to recreate the failure on Linux as well by introducing an artificial delay between the two server dispatches.
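
One way to introduce such a delay is a setTimeout between the write and the end on the server side; the sketch below is illustrative (the 100 ms value and port are not taken from the actual reproduction):

```js
// Force two discrete dispatches: write() sends the status line, headers and
// the first chunk; the delayed end() sends the terminating 0\r\n\r\n on its own.
const http = require('http');

http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.write('hello');
  setTimeout(() => res.end(), 100);   // artificial delay between the two dispatches
}).listen(8000, '127.0.0.1');
```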

Since the response comes in two parts that together form a single HTTP message, it is unfair for the client to assume a particular reception mode, whether together or separate.

In short, I believe this is an invalid test case. Keeping this issue open for some discussion.


mtbrandy commented on September 27, 2024

This issue also affects test/simple/test-http-default-encoding.js and test/simple/test-http-request-end.js. Because of changes in node between 0.10.25 and 0.11.10, it results in test case failures on the latter release.

I have produced a simple test that demonstrates the difference in behavior between AIX and pLinux, and am now seeking advice on how to fix or work around the problem.
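
(That test is not reproduced here. A self-contained reproducer in the same spirit might count the discrete receptions a raw client sees for a two-step response over loopback, along these lines; everything below is a hypothetical sketch, not the test attached to this issue.)

```js
// Hypothetical reproducer: count how many 'data' events a raw TCP client
// observes for a response that the server writes in two steps over loopback.
const http = require('http');
const net = require('net');

const server = http.createServer((req, res) => {
  res.write('part one');   // dispatch 1: status line, headers, first chunk
  res.end();               // dispatch 2: terminating 0\r\n\r\n
});

server.listen(0, '127.0.0.1', () => {
  const port = server.address().port;
  const client = net.connect(port, '127.0.0.1', () => {
    client.write('GET / HTTP/1.1\r\nHost: localhost\r\nConnection: close\r\n\r\n');
  });

  let receptions = 0;
  client.on('data', () => receptions++);
  client.on('end', () => {
    console.log('discrete receptions: %d', receptions); // 1 when coalesced, 2 when split
    server.close();
  });
});
```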


mtbrandy commented on September 27, 2024

Received the following two suggestions:

  1. Enable the tcp_fastlo tunable.
$ no -o tcp_fastlo=1

From the manpage:

The transmission control protocol (TCP) fastpath loopback option is used to achieve
better performance for the loopback traffic.

The tcp_fastlo network tunable parameter permits the TCP loopback traffic to reduce
the distance for the entire TCP/IP stack (protocol and interface) to achieve better
performance.

The application does not require any changes when using this option. When enabled,
the TCP loopback traffic is handled similarly to the UNIX domain implementation.
  2. Enable the tcp_nodelayack tunable.
$ no -o tcp_nodelayack=1

From the manpage:

The tcp_nodelayack option prompts TCP to send an immediate acknowledgement,
rather than the usual 200 ms delay. Sending an immediate acknowledgement might
add a little more overhead, but in some cases, greatly improves performance.

Performance problems have been seen when TCP delays sending an acknowledgement
for 200 ms, because the sender is waiting on an acknowledgment from the receiver and
the receiver is waiting on more data from the sender. This might result in low streaming
throughput. If you suspect this problem, you should enable the tcp_nodelayack option to
see if it improves the streaming performance. If it does not, disable the tcp_nodelayack
option.

Note that the no command also accepts a -p option to make non-default settings persistent across system reboots (for example, no -p -o tcp_nodelayack=1).
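
The tunables above are system-wide kernel settings. As a point of comparison only (this is an assumption about what one might experiment with, not something suggested in this issue), Node exposes a sender-side knob, socket.setNoDelay(), which disables Nagle's algorithm. It is not equivalent to tcp_nodelayack, which affects the receiver's delayed ACK, but the two mechanisms interact in the classic Nagle/delayed-ACK stall, so it can be useful when poking at this class of timing problems:

```js
// Illustrative only: disable Nagle's algorithm on a Node TCP socket so small
// writes go out immediately instead of waiting to be coalesced. This is a
// per-socket, sender-side setting, not a replacement for the AIX tunables.
const net = require('net');

const server = net.createServer((socket) => {
  socket.setNoDelay(true);      // send small segments without Nagle buffering
  socket.write('first\r\n');
  socket.end('second\r\n');
});

server.listen(8000, '127.0.0.1');
```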


mtbrandy commented on September 27, 2024

The tests mentioned in this issue no longer fail after toggling the above tunables.


mtbrandy commented on September 27, 2024

Enabling these tunables had the side effect of causing another test (test-dgram-pingpong.js) to time out due to degraded performance.

We've since tried different permutations of these tunables and the test results suggest the following settings: tcp_nodelayack=1 and tcp_fastlo=0 (the default).


gireeshpunathil commented on September 27, 2024

Closing this issue as this is no longer a concern. Will reopen if it surfaces again.

