
Comments (17)

rixo commented on May 12, 2024

I've encountered the same thing with the current version. Requests with bodies fail to be proxied (see my PR for some details).

As a temporary solution, you can try to use this branch in my fork (for example, npm install rixo/dyson#fix-proxy-body). It's vanilla dyson except for the fix in question, so you'll be able to switch back easily once the issue has been fixed here.

from dyson.

webpro commented on May 12, 2024

Hi @pjacekm, I would be interested in your feedback; does the latest release of dyson fix the issues for you?


pjacekm commented on May 12, 2024

Hi @webpro, thank you for your response. I uninstalled the dyson package and then installed it again, effectively upgrading to version 0.7.0. Unfortunately, after the upgrade it's even worse: all POST operations fail with a 500 error, even the ones that worked well before the upgrade.


webpro commented on May 12, 2024

That's unfortunate, @pjacekm. Any chance you could set up a reproducible case so I can investigate it further?


pjacekm commented on May 12, 2024

I tried to investigate it further. The remote API is third-party software, and it will be difficult for me to obtain their logs or any meaningful feedback. I prepared two tests:

  • The first redirects the call to a local environment: a simple page created in ColdFusion that dumps the request data received from the calling application through Dyson
  • The second calls the remote API directly

I must admit that I'm not a node developer, so I hope I didn't make any basic errors :) Let me describe my environment and all the data I consider relevant:

I'm running node v. 4.2.2 on Mac OS "El Capitan". I've installed Dyson globally.

Installed modules:
$npm -g ls --depth=0
/usr/local/lib
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
└── [email protected]

I've configured two local hosts in /etc/hosts file. The first one is for Dyson, the second one is where Dyson should proxy the requests for a local test:

127.0.0.1           dyson.test.local
127.0.0.1           wrapper.test.local

Example data sent from HTTPRequester Firefox plugin is the following:

Request: POST http://dyson.test.local:3000/api/oauth/token
Content Type: application/x-www-form-urlencoded
Content body: client_id={...id...}&grant_type=password&username={...user...}&password={...pass...}

"dyson.json" contains the following configuration:

{
  "proxy": true,
  "proxyHost": "http://wrapper.test.local",
  "proxyPort": "80"
}

/post/test.js contains the following configuration:

module.exports = {
    path: '/api/oauth/token',
    proxy: true, 
    cache: false
};

Under "wrapper.test.local" the data is received as follows:

Content: client_id={...id...}&grant_type=password&username={...user...}&password={...pass...}
Headers: 
   accept   text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 
   content-type     application/x-www-form-urlencoded; charset=UTF-8 
   dnt  1 
   host     dyson.test.local:3000 
   accept-encoding  gzip, deflate
   accept-language  en-US,en;q=0.5
   cache-control    no-cache
   connection   keep-alive
   content-length   103 
method  POST
protocol    HTTP/1.1 

Changing "dyson.json" to proxy requests to the remote API (instead of the local environment) results in 500 Internal Server Errors:

{
  "proxy": true,
  "proxyHost": "https://{real_API_url}",
  "proxyPort": "443"
}

When I use the request data described in my initial post, the good news after upgrading the Dyson package is that the request body is no longer an empty binary; however, the remote API still responds with 500 errors. When I send the same data to their API directly (without Dyson's intermediation) using the same client (the HTTPRequester plugin), the operation completes without errors.

I'd appreciate your suggestions for other tests or debugging tools I might use to be able to solve the problem.


rixo commented on May 12, 2024

Hi @pjacekm, if I understand correctly:

  • POST requests are now successfully proxied to your local environment (whereas before version 0.7 they were hanging with an empty body)
  • identical requests made to the actual (remote) endpoint still fail to be proxied

You say the remote API responds with error 500; I think what you mean by that is that you see the following message in the dyson logs: "500 INTERNAL SERVER ERROR". Is that correct?

In that case, it might be very informative if you could edit the file lib/proxy.js in dyson and log the actual error at line 30 with console.log(error) (or console.log(error.message, error.stack) if the former is hard to read).
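For illustration, a minimal sketch of such a logging helper (hypothetical; the actual code in dyson's lib/proxy.js may be shaped differently):

```javascript
// Hypothetical helper mirroring the suggested edit: surface the full
// error instead of only the generic "500 INTERNAL SERVER ERROR" line.
function describeProxyError(error) {
  // error.message carries the cause (e.g. a TLS altname mismatch),
  // error.stack shows where it was raised.
  return error.message + '\n' + (error.stack || '');
}

console.log(describeProxyError(new Error('example proxy failure')));
```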

Apparently the actual endpoint that does not work is HTTPS, while your local endpoint that works is HTTP? That may be a lead.


pjacekm commented on May 12, 2024

Hi @rixo, your assumptions were correct, and your suggestions were absolutely enlightening! I added console.log(error) in lib/proxy.js and indeed there was a problem with the certificate:

Proxying /api/oauth/token to https://{...host...}:443/api/oauth/token
500 INTERNAL SERVER ERROR: https://{...host...}:443/api/oauth/token
{ [Error: Hostname/IP doesn't match certificate's altnames: (...)

Interestingly, when I changed http://dyson.test.local:3000 to a domain that matched one of the certificate altnames (it was defined as *.theirdomain.com), I started to receive 404 errors from their nginx instance. Calling their URL directly with the same data resulted in a correct transaction. What a bummer! After some testing and debugging (I added console.log(req.headers) to lib/proxy.js), I found out that something was adding a "Host" header to the request:

host: 'dyson.{...host...}.com:3000'

After defining a custom "Host" header in my request (I understand this is bad practice?), the requests I have tested so far (more awaiting) went well.

Is Dyson (or any of the libraries it uses) adding this Host header? I have tested two applications so far (a Firefox plugin called HTTPRequester and the DHC Restlet Chrome plugin), and in both cases the "Host" header is added automatically. The former lets me overwrite the "Host" header manually, but the latter doesn't.
It's not an ideal scenario though, because I will use those plugins only in the initial phase; after that we need to test our real application without any changes to its code. Introducing changes in the wrapper (= Dyson) would be OK, but adding a special header in our application would not meet our requirement that the tested code stay unchanged... Unless I can hardcode my own "Host" header in Dyson config files for POST? That would be a solution I could live with ;)


rixo commented on May 12, 2024

So it seems that it's our side that is rejecting their certificate. Probably because it is self-signed?

To confirm this diagnosis, you can try adding rejectUnauthorized: false to the proxy request's options; then it should work. As noted everywhere, this option is not production-grade, but I guess mock servers are not often used in production...
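As a sketch (rejectUnauthorized is a real option understood by node's TLS stack and the request module; the surrounding options object here is illustrative, not dyson's actual code):

```javascript
// Illustrative options for the proxied call; only rejectUnauthorized
// is the point here. It disables TLS certificate validation entirely,
// so it must never be used in production.
var proxyRequestOptions = {
  url: 'https://wrapper.test.local:443/api/oauth/token', // example target
  method: 'POST',
  rejectUnauthorized: false
};

console.log(proxyRequestOptions.rejectUnauthorized); // false
```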


pjacekm commented on May 12, 2024

Their certificate doesn't look self-signed to me; according to the "Issuer" info it was issued by DigiCert Inc. I added the rejectUnauthorized: false option as you suggested, and indeed the 500 errors are gone. I will leave the setting in place; as you said, it's not a production server, so it's not a big deal.

Anyway, I'm still getting 404 errors when the Host header does not specify their exact domain. It's possible that their nginx proxy configuration is too restrictive. I have modified our copy of the Dyson code, so now we can add a requestHeaders option in the resource configuration files, as in the following example:

module.exports = {
    path: '/api/oauth/token',
    proxy: true, 
    requestHeaders: {
        'host':'{...host...}',
        'X-My-Always-Added-Header':'value'
    }
};

I have modified defaults.js too. This option is "transported" in util.js (similarly to options), and then used in proxy.js:

// Copy the headers configured for this resource onto the outgoing request:
var requestHeaders = util.configs.get(req.method.toLowerCase(), req.url).requestHeaders;
for (var h in requestHeaders) {
    if (requestHeaders.hasOwnProperty(h)) {
        req.headers[h] = requestHeaders[h];
    }
}

Probably not the most elegant solution, but the general idea serves our purpose.

I still don't know how to match URLs with dynamic params, like /api/Upload/Chunked/Files/:id, against the new requestHeaders option I added to module.exports. If you know how this could be accomplished in Node.js and Express, I'd certainly appreciate a tip :)

Thank you very much for your help and great work on this useful project!


pjacekm commented on May 12, 2024

As a follow-up: I'm trying to match the :id parameter in proxy.js. It is defined in the resource JS file:

module.exports = {
    path: '/api/Upload/Chunked/Files/:id',

When I use console.log(req.params) in proxy.js, I'm getting:

{ '0': '/api/Upload/Chunked/Files/deeea785-073c-492a-8f65-c006552d7fd1' }

instead of req.params.id as per the documentation.

Is this an issue with the Dyson package, or is that just how it works, with no way to map :id to deeea785-073c-492a-8f65-c006552d7fd1 in this particular case?


rixo commented on May 12, 2024

Thanks for all your feedback. I think it all comes together now. The Host header is added by the party making the request, and is mandatory according to the HTTP spec. The problem is that we currently send all the original request headers (in which Host points to the dyson server) indiscriminately to the target endpoint. This causes two issues. On our side, node's https module chokes when it verifies the certificate identity because (I think) it relies on the Host header to determine who we think we are talking to. On the other end, the Host header is needed by the reverse proxy (their nginx) to determine the actual target server.

In the end, you were right: it all comes down to fixing the Host header. It can be done without the hassle of setting it manually, though. Simply replace this line with:

headers: require('lodash').omit(req.headers, ['host'])

The request module should then add the correct header itself. Since this should fix everything, could you please try it without the rejectUnauthorized: false option and confirm that it works?
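A self-contained sketch of the effect (it uses a minimal stand-in for lodash's omit so it runs without dependencies; the surrounding options object is assumed, only the headers line comes from the suggestion above):

```javascript
// Minimal stand-in for require('lodash').omit, kept dependency-free:
function omit(obj, keys) {
  var out = {};
  for (var k in obj) {
    if (Object.prototype.hasOwnProperty.call(obj, k) && keys.indexOf(k) === -1) {
      out[k] = obj[k];
    }
  }
  return out;
}

// Hypothetical shape of the proxied request options: dropping the
// incoming Host header lets the HTTP client set the correct one
// for the target endpoint.
function buildProxyOptions(req, targetUrl) {
  return {
    url: targetUrl,
    method: req.method,
    headers: omit(req.headers, ['host'])
  };
}

var fakeReq = {
  method: 'POST',
  headers: {
    host: 'dyson.test.local:3000',
    'content-type': 'application/x-www-form-urlencoded'
  }
};
console.log(buildProxyOptions(fakeReq, 'https://example.invalid/api').headers);
// → { 'content-type': 'application/x-www-form-urlencoded' }
```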

I don't know if you will still need to add custom headers to the request to the target endpoint, but for the record: proxied requests are not registered with express under their own route by dyson, but under '*'; that's why your params are not mapped. You can change this by adding the following line in this place:

app[method](config.path, proxy.middleware);

If you still need to access the dyson resource's config in proxy.js, you can use this version instead:

app[method](config.path, function(req, res) {
    req.dysonResourceConfig = config;
    proxy.middleware(req, res);
});

That will save you from having to look it up, which is fragile since express is perfectly happy to have multiple handlers matching the same route... Also, I think you should rename your option from requestHeaders to proxyHeaders, since there are lots of request types involved here.

Finally, if you still need this option for something other than fixing the Host header, please open another issue requesting the feature, with a bit of context about the use case. Hopefully this one will soon be finished.


pjacekm commented on May 12, 2024

Hi @rixo, it makes a lot of sense now. The 500 errors were caused by node trying to verify the certificate, and the 404 errors were caused by a reverse proxy in front of the remote API. The point of intersection was the "Host" header, whose value was proxied from the calling app to the remote API.

I removed rejectUnauthorized: false and put headers: require('lodash').omit(req.headers, ['host']) in place as you asked, and indeed it fixed all the issues. I've tried host names defined in the /etc/hosts file and bare IP addresses for the Dyson URLs, different ports, different methods (GET, POST, PUT), with and without a binary content body, and all requests were proxied smoothly. I think custom headers are no longer needed at this point, unless we run into the same problem again with another remote API :)

I'd like to ask you one more question before closing the issue. Is it possible to create a configuration that randomly proxies some of the requests to a resource while generating a mocked (locally defined) response for the others? For example, if the same resource URL GET /api/Clients/ is requested 3 times, it gets proxied to the remote API twice, and once Dyson generates a mocked response from the template or callback: functionName. I'd like to proxy most of the requests to the remote API and generate ugly errors once in a while.
The reason for this is that the remote API we're integrating with is quite complex, and all calls require an OAuth authentication header. Therefore I can't generate the auth header locally and then proxy the other requests to the remote API; it's either all proxied or all local, and generating the API responses locally would be a big headache due to their complexity and size. Being able to "inject" false errors occasionally would let us test our code better.

Thanks again for your help!


webpro commented on May 12, 2024

Thanks for a great discussion, guys. I'll give it a more thorough read soon, but for now I wanted to respond to the "random proxy" question.

Currently this is not possible with dyson: the paths that should be proxied are not registered with dyson at boot time. Once we refactor this decision (to proxy the request or not) to just before the render process, the requested feature becomes an option. Perhaps simply by allowing the proxy value to be true, false, or a number between e.g. 0 and 1 that represents the likelihood that the request will be proxied.
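The proposed semantics could be sketched like this (purely hypothetical; dyson has no such option at this point):

```javascript
// Proposed behavior: `proxy` may be true, false, or a number in [0, 1]
// giving the probability that a given request is proxied rather than
// answered with the locally defined mock.
function shouldProxy(proxyOption) {
  if (typeof proxyOption === 'number') {
    // Math.random() returns a value in [0, 1), so 0 never proxies
    // and 1 always proxies.
    return Math.random() < proxyOption;
  }
  return proxyOption === true;
}

shouldProxy(true);  // always proxied
shouldProxy(0.8);   // proxied roughly 4 out of 5 times
```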


pjacekm commented on May 12, 2024

Thank you @webpro for your response. I actually like the idea very much; it would give users the option to decide whether to proxy all requests, some of them (via the likelihood number), or none of them.

In my opinion, such a setting should be available at the individual resource config level, and the probability could be an integer, say between 1 and 10, to avoid playing with decimals and make the setting clearer to users.

Just my 2¢ ;)


pjacekm commented on May 12, 2024

Is there a way to generate delayed responses in Dyson? I'd like to simulate latency in responses, but after 2+ hours of googling I still can't figure it out.

setTimeout(function(){
     res.status(400).send({"error": "invalid_clientId", "error_description": "client_id is not set"});
}, 10000)

doesn't work: the response is sent immediately, and after 10 seconds an error appears in the console:

Error: Can't set headers after they are sent.

Edit: after some more testing I've found that putting next(); inside the setTimeout() callback does the trick. I prefer to ask the Node.js masters anyway :)
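Based on that finding, a delay middleware might look like this (a sketch, assuming dyson renders the response in a later middleware that is triggered by next(); calling res.send() before dyson renders is what produced the "Can't set headers after they are sent" error):

```javascript
// Defer next() so the downstream middleware that actually sends the
// response runs only after the delay. delayMs is a parameter so the
// same factory can produce different latencies.
function makeDelay(delayMs) {
  return function (req, res, next) {
    setTimeout(next, delayMs);
  };
}

// Hypothetical usage: app.use(makeDelay(10000)); // 10 s of latency
```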

Also, I'd like to be able to simulate connection errors, for example "ECONNRESET". Is there a way to accomplish this in Dyson? I was trying:

req.pause();
req.connection.destroy();

but it behaves strangely. The first time it's invoked it's OK, but when called again the request gets executed twice. It might also be a wrong placement of next();... see the full gist

I know such questions should be posted on a mailing list, but I don't think the Dyson project has one?


webpro commented on May 12, 2024

@pjacekm Please open new issues for new questions/topics. I'm actually the sole creator/maintainer of this project, but I've been getting some great help from people like @rixo. On to your questions:

  1. I've been meaning to implement timed/delayed responses in dyson for a long time. However, the need for this feature decreased a lot when it became easy to throttle responses with Chrome devtools (see screenshot).
  2. Maybe you can use the status property to simulate server-side errors?

[Screenshot (2015-11-13): Chrome devtools network throttling settings]
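To illustrate point 2, a hypothetical dyson resource using the status property mentioned above (the exact semantics of status should be checked against dyson's documentation; this object and its values are assumptions):

```javascript
// Hypothetical resource config; in a dyson resource file this object
// would be assigned to module.exports.
var errorResource = {
  path: '/api/oauth/token',
  status: 500, // assumed: a numeric status to simulate a server error
  template: {
    error: 'internal_error' // assumed mock body
  }
};
```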


pjacekm commented on May 12, 2024

@webpro Thanks for your comments. My remarks follow:

  1. We need to test compiled applications that consume RESTful APIs directly; they run inside a web browser. I'm not sure Chrome devtools can be useful in this scenario.
  2. We need to simulate situations where there is network latency or a connection issue and no response is generated server-side, or the connection is lost entirely. It's difficult to disconnect the network cable from a server :)

I'll open a few new issues soon, let's see what happens. Thanks again for your help!

