
json-api's Issues

Example error

I recently discovered this repo after attempting to roll my own. Thank you for your hard work on this, I've learned first hand that it's not as easy as it looks.

As I attempted to use your example, I discovered an undefined variable.

var registry = new API.ResourceTypeRegistry();
var controller = new API.controllers.Base(Registry);
var adapter = new API.adapters.Mongoose(models);

Is 'Registry' defined somewhere? My linter is throwing an error there.

If I change 'Registry' to 'registry' (as I assume it should be), I then get the following error when pinging the endpoint:

{"error":"TypeError: Cannot call method 'adapter' of undefined\n at prototype.GET (/node_modules/json-api/build/lib/controllers/Base.js:24:31)

Again, I totally appreciate your efforts here. Thanks.

Integrate with Koa?

I've been hearing a lot of good things about Koa, written by the team behind Express. It doesn't seem to have a json-api serializer yet, from what I can find. Have you thought about whether this project could also be used with Koa, in addition to Express? Or if not that, have you thought about writing a serializer similar to this one for Koa?

DELETE response has wrong Content-Type

I'm doing some testing with supertest:

headers =
        'Content-Type' : 'application/vnd.api+json; charset=utf-8'
        'Accept' : 'application/vnd.api+json; charset=utf-8; supported-ext=bulk'

request(app)
        .delete '/some/' + someId.toString()
        .set headers
        .expect 'Content-Type', headers['Accept']
        .expect 204
        .end done

I get the following error:

Error: expected "Content-Type" of "application/vnd.api+json; charset=utf-8; supported-ext=bulk", got "application/vnd.api+json; supported-ext="bulk""

GET, POST and PATCH work very well

Make it easier to mark a field as readonly

I see two meaningful types of readonly fields:

  1. "final" fields, which are fields that, once they're set (on the initial POST), are then unmodifiable (i.e. on PATCH). An example might be a username: settable to anything at first, but then not mutable.
  2. "server-managed" fields, which are fields that are utterly unsettable through the API (i.e. on POST or PATCH), even on initial creation. This is stuff like: created-date, modified-date, and id (when server-generated ids are in use).

Right now, if the user wants one of their API fields to be readonly, whether final or server-managed, they have to manually add logic in the beforeSave transform in the resource description, to 403 if the client attempts to change one of these readonly fields. (For the final case, they even have to inspect the request to check the method.)
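That manual transform might look something like this (a sketch only: the (resource, req) transform signature and the `resource.attrs` shape are assumptions, and a plain Error stands in for the library's APIError type):

```javascript
// A sketch of the manual approach described above: a beforeSave transform
// factory that rejects changes to "final" fields.
function makeReadonlyGuard(finalFields) {
  return function beforeSave(resource, req) {
    finalFields.forEach(function (field) {
      // "Final" fields may be set on the initial POST but never changed on
      // PATCH, so the transform has to inspect the request method.
      if (req.method === 'PATCH' && resource.attrs[field] !== undefined) {
        throw new Error('403: field "' + field + '" cannot be modified');
      }
    });
    return resource;
  };
}
```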

But this use case is common enough that there should be a declarative solution. That is, the user should be able to simply set a "server-managed" or "final" key to true in the resource description somewhere, and a transform would then automatically be added to handle the validation.

One natural place to put this key would be in the validation key that exists under each field in the info section. My issue with putting it here, though, is that it's not intuitive that values under info could actually affect the behavior of the app. So, instead, I'm thinking that resource type descriptions could get a "behaviors" key, structured like so:

{
  "behaviors": {
    "fields": {
      "created-at": {
        "server-managed": true
      }
    }
  }
}

Adding this info under behaviors would then also need to be taken into account by the DocumentationController, to add it to the auto-generated docs.

Restify integration

The Restify framework shares syntax with Express with a few minor variations. Is Restify currently supported, and if not what would be required to ensure integration?

Throw an error if model not found

Just a nicety: at the moment, when the adapter's find method gets called, if the type that it tries to find was never provided to the adapter, rather than erroring, it just hangs indefinitely. (I stupidly forgot to add a new model to my new API.dbAdapters.MongooseAdapter({...}) call, and it took me forever to figure out what was going wrong.)
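The fail-fast check this asks for could be sketched like so (the shape of the adapter's internal model map is an assumption; only the guard pattern is the point):

```javascript
// A sketch of failing fast: error immediately when the requested type was
// never registered with the adapter, instead of hanging indefinitely.
function getModel(models, type) {
  var model = models[type];
  if (!model) {
    // Surfacing the missing type by name makes the misconfiguration obvious.
    throw new Error('No model was registered for type "' + type + '".');
  }
  return model;
}
```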

Errors in beforeSave and beforeRender hooks cause hangs

beforeRender: function() {
    notAFunction()
}

Related: I can't figure out how I'm supposed to handle errors there. You mentioned something about the APIError type, but throwing or returning that causes hangs too. How do I reject a request entirely?

Handle meta

Pass off incoming meta to some user function to store/whatever, and give the user some function through which to set meta info on the response should they wish. This would also presumably live in the resource type description, though I haven't thought about its design beyond that.

When both ends of a relationship are of the same type, relationships are returned as attributes

For example, if I have the following schema

var PersonSchema = new Schema({
    parent: {
        type: Schema.ObjectId,
        rel: 'Person'
    }
})

mongoose.model('Person', PersonSchema)

and I GET request it, I get something like

{
    "data": [
        {
            "id": "1",
            "type": "person",
            "attributes": {
                "parent": "2"
            },
            "relationships": {}    
        }
    ]
}

instead of

{
    "data": [
        {
            "id": "1",
            "type": "person",
            "attributes": {},
            "relationships": {
                "parent": {
                    "id": "2",
                    "type": "person"
                }
            }    
        }
    ]
}

Handling PUT requests

I'm trying to get this package working with Ember Data. Ember Data is generating PUT requests, but I can't figure out how to respond to them. Is there an example somewhere?

Custom query parameter support

Are there any plans to support custom query parameters? The spec allows them.

In my particular case I have a custom dbAdapter and want to pass locale parameters to the adapter (e.g. /chapter/12345?Locale=en_us).

Does not support objects in an array

It would appear that json-api does not support arrays containing objects. Consider the following schema:

var discussionSchema = new Mongoose.Schema({
  messageCount: Number,
  lastIndex: Number,
  message: [{
    index: { type: Number, required: true },
    parentMessage: Number,
    author: { type: ObjectId, ref: 'User', required: true },
    text: { type: String, required: true, validate: Validators.isLength(2, 1024) },
    dateCreated: { type: Date, default: Date.now }
  }]
});

It errors with the following stack:

TypeError: Cannot read property 'name' of undefined
    at getFieldType (/node_modules/json-api/build/src/db-adapters/Mongoose/MongooseAdapter.js:630:51)
    at /node_modules/json-api/build/src/db-adapters/Mongoose/MongooseAdapter.js:641:25
    at Schema.eachPath (/node_modules/mongoose/lib/schema.js:506:5)
    at Function.getStandardizedSchema (/node_modules/json-api/build/src/db-adapters/Mongoose/MongooseAdapter.js:636:20)
    at DocumentationController.getTypeInfo (/node_modules/json-api/build/src/controllers/Documentation.js:117:40)
    at /node_modules/json-api/build/src/controllers/Documentation.js:69:39
    at Array.forEach (native)
    at new DocumentationController (/node_modules/json-api/build/src/controllers/Documentation.js:68:27)
    ...

When I remove the message property it works fine.

Remove `id` field from the auto-generated documentation

The only reason I can think of for including it is this: if the API accepted client-generated ids, it might want to include id as a field to indicate that the client can provide it. But given that APIs generated with this library don't take client-generated ids anyway, it shouldn't be there for now.

Set Vary header properly

Really, I think this means we just need to add Vary: Accept to every response with the vary library.

We're clearly varying on accept for the documentation request, since we actually have distinct html and json representations. But we're also varying on accept for the api requests, since invalid Accept headers can trigger a 406 for which caches can't use the 200 response.
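For illustration, a minimal sketch of what the vary library does (the real library should be preferred, since it also handles edge cases like an existing `Vary: *`):

```javascript
// Append a field to the response's Vary header without duplicating it.
function addVary(res, field) {
  var current = res.getHeader('Vary');
  if (!current) {
    res.setHeader('Vary', field);
  } else if (
    current
      .toLowerCase()
      .split(',')
      .map(function (s) { return s.trim(); })
      .indexOf(field.toLowerCase()) === -1
  ) {
    res.setHeader('Vary', current + ', ' + field);
  }
}
```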

Testing the example

Hi,

When I test the example, it currently gives me this error:

/tmp/json-api-example/src/index.js:25
  , Controller = new API.controllers.API(registry);
                 ^
TypeError: undefined is not a function
    at Object.<anonymous> (/tmp/json-api-example/src/index.js:25:18)
    at Module._compile (module.js:460:26)
    at Object.Module._extensions..js (module.js:478:10)
    at Module.load (module.js:355:32)
    at Function.Module._load (module.js:310:12)
    at Function.Module.runMain (module.js:501:10)
    at startup (node.js:129:16)
    at node.js:814:3

Validate the type on incoming requests

This should just be another simple check in the APIController to make sure the adapter for the requested type exists. It will handle cases in which bad requests are routed to the library, as happened in #9 (comment).

More tests!

This library now has pretty good test coverage (about 90%, with 265 tests), but still, it'd be good to:

  • Add "integration" tests for the JSON version of the auto-generated documentation, since its format has already been subject to a couple regressions in the past.
  • Make sure that each endpoint type/method has at least basic tests, to catch major regressions. Most do already, but there may be one or two that don't. These tests could be as simple as "did it return 200 with data even vaguely right (e.g. non-empty)".
  • Add tests for the most-complex features of the spec, like nested includes.

The lowest priority is to write tests for the "middle complexity" features, like precise status codes and the like. Not that those aren't important, but they're just lower priority.

Failed PUTs return 200

If you PUT to a resource, and the data you're PUTting is in the right format but the db rejects it (e.g. because it fails to pass validation rules)...

  1. The resource isn't actually modified in the database (good!), but
  2. the server returns a 200 with the body holding the contents of the resource as it would be if the update had succeeded.

Make “transforms” more flexible by allowing async behavior

I’d like to use the beforeSave function not so much as a transformation, but as a place to do some asynchronous work, and have the ability to carry out or cancel the save based on the results of the async task.

For example, the specific case I have right now is a resource that represents a file in an S3 bucket: I want to perform an upload to S3, which upon succeeding, will continue to save to Mongo. If the bucket upload fails, though, I want to be able to reject or something and kick off an error response.

A related thought that I’ve been mulling over for some time is that maybe it would be worth exploring making each of these steps a distinct middleware. Although that would mean a function signature specific to Connect and Express servers, so maybe it doesn’t jibe with the project’s vision. At any rate, I think these transforms could detect whether a Promise is being returned, and wait for it to settle if needed.
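The promise-detection idea could be sketched like this (`uploadToS3` is a hypothetical helper standing in for the real S3 client call, and the transform signature is an assumption; the library would detect the returned Promise and wait for it to settle):

```javascript
// A promise-aware beforeSave: perform async work, then carry out or cancel
// the save based on the result.
function beforeSave(resource, uploadToS3) {
  return uploadToS3(resource.attrs.file).then(function (url) {
    resource.attrs.url = url; // upload succeeded: continue the save
    return resource;
  });
  // If the upload rejects, the rejection propagates, cancelling the save
  // and kicking off an error response.
}
```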

What do you think, @ethanresnick?

Relationship self-links should be contained in links object

Relationship links should be contained in a links object but json-api currently adds the self-link as a plain string to the relationship object.
Actual:

"relationships": {
  "author": {
    "data": { ... },
    "self": "example.com/books/12345/relationships/author"
  }
}

Expected:

"relationships": {
  "author": {
    "data": { ... },
    "links": {
      "self": "example.com/books/12345/relationships/author"
    }
  }
}

Multi-level include paths

I wanted to experiment a bit with multi-level include paths but noticed that they aren't implemented yet. Maybe you could help me out and tell me which result I should expect when running the following query against the json-api example. In addition to the example, please assume that there is a to-many relationship friends of type people for each people resource.

http://127.0.0.1:3000/schools/abc...?include=liaisons,liaisons.friends

Would I get a result looking like the following?


{
  "data": {
   // ...
    "links": {
      "liaisons": {
        "linkage": [{ "type": "people", "id": "..." }, ...],
        "friends": {
          "linkage": [{"type: "people", "id":"..."}, ...]
        }
      }
    }
  }
}

PostgreSQL Adapter

The documentation told me to open this issue! :)
Anyway I would like to try this with a PostgreSQL database, so an adapter for that would be welcome.

If there were some additional information on how adapters are supposed to work I might be able to look into this myself.

sendError always sending 500s

It appears that when you use sendError as suggested in the sample project, it always hits the default case and sends a general unknown error.

{"errors":[{"status":"500","title":"An unknown error occurred while trying to process this request."}]}

After diving through the code a bit it seems to only work if you send it an instance of APIError. Is that the case or am I missing something?

If that is so, perhaps the documentation should be updated slightly? Something along the lines of:

var API = require('json-api');
var APIError = API.types.Error;
...
apiRouter.use(function(req, res, next) {
    Front.sendError(new APIError(404, undefined, 'Not Found'), req, res);
});

Babel as dev-dependency

Currently we are trying to execute ES6 tests in a module where we have the json-api as a dependency. When I require 'babel/register' in one of my build steps I get the error only one instance of babel/polyfill is allowed
The problem is that we have two different versions of babel as dependencies (the one we specified in our top-level module and the one that is json-api's dependency) and both try to add `global._babelPolyfill`. Even though I've seen #11, this causes us serious headaches. Could you consider specifying babel as a dev-dependency again?

JSON API RC3 tracking

Hey there! I just wanted to let you know that JSON API has hit RC3, and we'd like client libraries to start implementing it like it was 1.0, as a final check that we're good for 1.0. I wanted to make this issue to let you know!

json-api/json-api#484

How do I remove a particular resource from the response entirely?

I can get the beforeRender hook to remove specific fields, but I can't seem to get it to remove the resource altogether.

Looking at the code, it seems as though the way to do it is to replace the resource with undefined, but that logic is only applied to Collections and returning undefined from the beforeRender hook just results in a hung request for me.

Should filtering be supported?

I’m opening this to discuss support of filtering. I started by needing some filtering and wondering if this package already supported it. I couldn’t find anything about it in the docs, but I see some code floating around in do-get.js. I can’t figure out exactly how to use it or what syntax is supported, at least not without spending a lot of time debugging and reverse engineering it.

But, filtering isn’t actually part of the JSON API spec, and there are bound to be cases where we need support for more complex cases than it makes sense to incorporate into this package. I think it would make sense to remove any built-in implementation of filtering. But right now, it doesn’t look like there’s a place we (users of this package) can add our own implementations back in. Or maybe there is, but it’s not obvious or not documented?

Have you given any thought to a good way forward?

PUT handlers hang

If you try to do this…

app.put('/people/:id', handler)

…requesting PUT /people/1 results in a hang. 404 would probably be best, or a crash with a nice error message telling the dev to use PATCH.
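A catch-all along these lines would fail fast instead of hanging (a sketch assuming Express-style middleware; the Allow header contents are illustrative):

```javascript
// Reject PUT with a 405 that points the client at PATCH, rather than hanging.
function rejectPut(req, res, next) {
  if (req.method === 'PUT') {
    res.statusCode = 405;
    res.setHeader('Allow', 'GET, POST, PATCH, DELETE');
    res.end('PUT is not supported; use PATCH to update resources.');
  } else {
    next(); // any other method falls through to the normal handlers
  }
}
```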

Apply transforms to linkage

Right now, when the response data being transformed is:

  • a single Resource (e.g. for non-bulk PATCH/POST requests, or requests to GET a single resource), the beforeSave/beforeRender function is called once, with the Resource as an argument
  • a Collection, the beforeSave, beforeRender function is called once for each resource, with the Resource as an argument.
  • a Linkage object (e.g. for requests to relationship endpoints), the linkage passes through untransformed. This is unacceptable, since linkage may need to be transformed too, e.g. if the server maps internal ids to different external ones, or wants to transparently rename a type.

However, I want to maintain the constraint that the transform functions are always called with a single resource object, so that users writing beforeSave/beforeRender functions don't have to do any switching on types (which the codebase already does way too much of, in part because the types aren't expressive enough).

So, my inclination, when linkage is the subject of the transform, is to wrap the linkage in an empty Resource object (preferably of the correct type, and preferably with the linkage at the right relationship path) and pass that Resource to the transform, for consistency.

This shouldn't cause any problems because, if linkage needs to be transformed, it should be transformed in the same way whether it's returned alone, or as part of a larger resource or in a collection of resources. However, we could also pass the original/raw object Linkage|Collection|Resource object as an extra argument to the transform function, just in case there are some edge cases for which knowing the original subject of the transform is important.

Handle "self" link protocol when proxied

I currently have json-api setup behind a Nginx proxy. The proxy sets up a SSL tunnel. All the clients speak only over HTTPS, but everything behind the proxy is only HTTP.

Self links derive their protocol from the incoming request protocol only. The simple fix is to use the traditional X-Forwarded-Proto header instead (if set).

There is probably a better way to handle this, but I'd like to get your thoughts before I post a PR. I just change this line to this:

it.uri = (req.get('X-Forwarded-Proto') || req.protocol) + "://" + req.get("Host") + req.originalUrl;

Dasherizing member names

The JSON API spec recommends that member names be dasherized. I think we should be dasherizing by default, but there probably needs to be an easy way to opt out of it. There are also some challenges to un-dasherizing (camelizing) some names. For instance, say you have a Mongoose schema like:

let PersonSchema = new mongoose.Schema({
  name: String,
  imageURL: String
});

It’s easy to dasherize imageURL to image-url on outgoing responses, but on an incoming POST request, for example, you might have a request body that looks like

{
  "data": {
    "type": "people",
    "attributes": {
      "name": "Andrew",
      "image-url": "http://placehold.it/100x100"
    }
  }
}

the camelization of image-url will result in imageUrl rather than imageURL.

So, I’m wondering if we should build in a way to customize the dasherized/camelized relationship. In my project, I’m handling this by adding a virtual field called imageUrl on the schema.

We could kill two birds (the “opt-in/opt-out” bird and the “customized transform” bird) with one stone by allowing users to specify a pair of functions when defining resources:

import camelize from 'camelize';
import dasherize from 'dasherize';

registry.type('people', {
  adapter: adapter,
  urlTemplates: {
    "self": "/people/{id}"
  },
  transformMemberNames: {
    fromRequest: camelize, // request comes in dasherized, camelize it before accessing store
    fromStore: dasherize   // names are camelized in store, dasherize before sending response
  }
});

Simple camelize/dasherize functions could be the default, but it would be easy to override them on a per-resource basis to check for special cases like

fromRequest: dasherized => {
  if (dasherized === 'image-url') {
    return 'imageURL';
  }
  return camelize(dasherized);
}

or to opt out of transformations:

transformMemberNames: {
  fromRequest: name => name,
  fromStore: name => name
}

I’m definitely not set on the exact naming above, maybe fromRequest and fromStore would be better as deserialize and serialize. But what do you think of going in this direction?

Incorrect filtering leads to problems

GET /people?filter[simple][name][going][deeper][gives][errors] gives a 500 error

It might also be helpful for newcomers if GET /people?filter[name]=John gave an error of some sort. It took me a while to find [simple].
