ethanresnick / json-api
Turn your node app into a JSON API server (http://jsonapi.org/)
License: GNU Lesser General Public License v3.0
Maybe a single Waterline-based "adapter" could work with Mongo and SQL databases using existing Waterline plugins? https://github.com/balderdashy/waterline
I recently discovered this repo after attempting to roll my own. Thank you for your hard work on this; I've learned firsthand that it's not as easy as it looks.
As I attempted to use your example, I discovered an undefined variable.
var registry = new API.ResourceTypeRegistry();
var controller = new API.controllers.Base(Registry);
var adapter = new API.adapters.Mongoose(models);
Is `Registry` defined somewhere? My linter is throwing an error there.
If I change `Registry` to `registry` (as I assume it should be), I then get the following error when pinging the endpoint:
{"error":"TypeError: Cannot call method 'adapter' of undefined\n at prototype.GET (/node_modules/json-api/build/lib/controllers/Base.js:24:31)
Again, I totally appreciate your efforts here. Thanks.
I've been hearing a lot of good things about Koa, written by the team behind Express. It doesn't seem to have a json-api serializer yet, from what I can find. Have you thought about whether this project could also be used with Koa, in addition to Express? Or if not that, have you thought about writing a serializer similar to this one for Koa?
Move this into a pre-query step, since JSON API document validation isn't an HTTP-level concern.
I'm doing some testing with supertest:
headers =
  'Content-Type' : 'application/vnd.api+json; charset=utf-8'
  'Accept' : 'application/vnd.api+json; charset=utf-8; supported-ext=bulk'

request(app)
  .delete '/some/' + someId.toString()
  .set headers
  .expect 'Content-Type', headers['Accept']
  .expect 204
  .end done
I get the following error:
Error: expected "Content-Type" of "application/vnd.api+json; charset=utf-8; supported-ext=bulk", got "application/vnd.api+json; supported-ext="bulk""
GET, POST and PATCH work very well
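A possible workaround sketch, assuming the goal is just to assert the media type rather than the exact parameter serialization: supertest's `.expect(field, value)` also accepts a RegExp for the header value, which matches both the quoted and unquoted `supported-ext` forms.

```javascript
// The exact-string expectation fails because the server quotes the ext value
// ('supported-ext="bulk"') while the request's Accept header leaves it unquoted.
// A RegExp passed to supertest's .expect('Content-Type', re) sidesteps this.
var contentTypeRe = /^application\/vnd\.api\+json\b/;

var sentAccept = 'application/vnd.api+json; charset=utf-8; supported-ext=bulk';
var received = 'application/vnd.api+json; supported-ext="bulk"';

console.log(contentTypeRe.test(sentAccept)); // true
console.log(contentTypeRe.test(received));   // true
```

In the CoffeeScript test above, that would be `.expect 'Content-Type', /^application\/vnd\.api\+json\b/`.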
I see two meaningful types of readonly fields:

- "final" fields, like `username`: settable to anything at first, but then not mutable;
- server-managed fields, like `created-date`, `modified-date`, and `id` (when server-generated ids are in use).

Right now, if the user wants one of their API fields to be readonly, whether final or server-managed, they have to manually add logic in the `beforeSave` transform in the resource description to 403 if the client attempts to change one of these readonly fields. (For the final case, they even have to inspect the request to check the method.)
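For concreteness, here is a minimal sketch of what that manual logic looks like today; the field list, error shape, and transform signature are illustrative, not the library's actual API:

```javascript
// Hypothetical manual guard in a resource description's beforeSave transform.
// READONLY_FIELDS and the error shape are assumptions for illustration.
var READONLY_FIELDS = ['created-date', 'modified-date'];

function beforeSave(resource) {
  READONLY_FIELDS.forEach(function (field) {
    if (resource.attrs && field in resource.attrs) {
      var err = new Error('Field "' + field + '" is read-only.');
      err.status = 403; // surfaced as a 403 Forbidden response
      throw err;
    }
  });
  return resource;
}
```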
But this use case is common enough that there should be a declarative solution. That is, the user should be able to simply set a `"server-managed"` or `"final"` key to `true` somewhere in the resource description, and then a transform would automatically be added to handle the validation.

One natural place to put this key would be in the `validation` key that exists under each field in the `info` section. My issue with putting it there, though, is that it's not intuitive that values under `info` could actually affect the behavior of the app. So, instead, I'm thinking that resource type descriptions could get a `"behaviors"` key, structured like so:
{
  "behaviors": {
    "fields": {
      "created-at": {
        "server-managed": true
      }
    }
  }
}
Adding this info under `behaviors` would then also need to be taken into account by the `DocumentationController`, so that it appears in the auto-generated docs.
The Restify framework shares its routing syntax with Express, with a few minor variations. Is Restify currently supported, and if not, what would be required to ensure integration?
Just a nicety: at the moment, when the adapter's `find` method gets called, if the type that it tries to find was never provided to the adapter, rather than erroring, it just hangs indefinitely. (I stupidly forgot to add a new model to my `new API.dbAdapters.MongooseAdapter({...})` call, and it took me forever to figure out what was going wrong.)
This is actually needed for 1.0 compliance. (I overlooked it earlier.)
It should be `expect(foo).to.be.an("object")`, called as a method; the property-access form will not throw an error. This means there are a lot of tests that aren’t actually testing anything. 😦
beforeRender: function() {
notAFunction()
}
Related: I can't figure out how I'm supposed to handle errors there. You mentioned something about the `APIError` type, but throwing or returning that causes hangs too. How do I reject a request entirely?
It's the first link in this section: https://github.com/ethanresnick/json-api#example-api
Pass off incoming meta to some user function to store/whatever, and give the user some function through which to set meta info on the response should they wish. This would also presumably live in the resource type description, though I haven't thought about its design beyond that.
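One possible shape for that, purely speculative (the `meta` key and both function names are invented here, not existing API):

```javascript
// Speculative design sketch; nothing below exists in json-api today.
registry.type('people', {
  meta: {
    // called with the incoming document's meta, for the user to store/inspect
    fromRequest: function (meta, req) { /* ... */ },
    // whatever this returns gets merged into the response document's meta
    forResponse: function (req) { return { apiVersion: '1.0' }; }
  }
});
```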
I.e. the following works.
PUT /resource-type/resource-id
{
"relationshipName": "newID"
}
Whereas the spec requires:
PUT /resource-type/resource-id
{
"links": {
"relationshipName": "newID"
}
}
app.post('/route', function(req, res) {
var error = new APIError(500)
Front.sendError(error, req, res)
})
Problematic line. Potentially related to #55.
For example, if I have the following schema
var PersonSchema = new Schema({
  parent: {
    type: Schema.ObjectId,
    ref: 'Person'
  }
});
mongoose.model('Person', PersonSchema);
and issue a GET request, I get something like
{
  "data": [
    {
      "id": "1",
      "type": "person",
      "attributes": {
        "parent": "2"
      },
      "relationships": {}
    }
  ]
}
instead of
{
  "data": [
    {
      "id": "1",
      "type": "person",
      "attributes": {},
      "relationships": {
        "parent": {
          "data": {
            "id": "2",
            "type": "person"
          }
        }
      }
    }
  ]
}
I'm trying to get this package working with Ember Data. Ember Data is generating PUT requests, but I can't figure out how to respond to them. Is there an example somewhere?
Are there any plans to support custom query parameters? The spec allows them. In my particular case, I have a custom dbAdapter and want to pass locale parameters to the adapter (e.g. `/chapter/12345?Locale=en_us`).
Support auto-generating copy-and-pasteable example request and response payloads for each model, and including them in the HTML docs. Should be relatively easy. Consider https://stripe.com/docs/api as a model for the layout.
It would appear that json-api does not support arrays of embedded objects. Take the following schema:
var discussionSchema = new Mongoose.Schema({
messageCount: Number,
lastIndex: Number,
message: [{
index: { type: Number, required: true },
parentMessage: Number,
author: { type: ObjectId, ref: 'User', required: true },
text: { type: String, required: true, validate: Validators.isLength(2, 1024) },
dateCreated: { type: Date, default: Date.now }
}]
});
It errors with the following stack:
TypeError: Cannot read property 'name' of undefined
at getFieldType (/node_modules/json-api/build/src/db-adapters/Mongoose/MongooseAdapter.js:630:51)
at /node_modules/json-api/build/src/db-adapters/Mongoose/MongooseAdapter.js:641:25
at Schema.eachPath (/node_modules/mongoose/lib/schema.js:506:5)
at Function.getStandardizedSchema (/node_modules/json-api/build/src/db-adapters/Mongoose/MongooseAdapter.js:636:20)
at DocumentationController.getTypeInfo (/node_modules/json-api/build/src/controllers/Documentation.js:117:40)
at /node_modules/json-api/build/src/controllers/Documentation.js:69:39
at Array.forEach (native)
at new DocumentationController (/node_modules/json-api/build/src/controllers/Documentation.js:68:27)
...
When I remove the `message` property, it works fine.
The only reason I can think of for including it is this: if the API accepted client-generated ids, it might want to include `id` as a field, to indicate that the client can provide it. But given that APIs generated with this library don't take client-generated ids anyway, it shouldn't be there for now.
Really, I think this means we just need to add `Vary: Accept` to every response, using the vary library.

We're clearly varying on Accept for the documentation request, since we actually have distinct HTML and JSON representations. But we're also varying on Accept for the API requests, since invalid `Accept` headers can trigger a 406, for which caches can't reuse the 200 response.
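For illustration, here is a minimal sketch of what the vary library does under the hood (simplified; the real package also handles `*` and other edge cases):

```javascript
// Append a field to the response's Vary header without duplicating it or
// clobbering values that earlier middleware set.
function addVary(res, field) {
  var current = res.getHeader('Vary');
  if (!current) {
    res.setHeader('Vary', field);
  } else if (current.toLowerCase().split(/\s*,\s*/).indexOf(field.toLowerCase()) === -1) {
    res.setHeader('Vary', current + ', ' + field);
  }
}

// Tiny stand-in for an Express response, for demonstration only:
function fakeRes() {
  var headers = {};
  return {
    getHeader: function (name) { return headers[name]; },
    setHeader: function (name, value) { headers[name] = value; }
  };
}

var res = fakeRes();
addVary(res, 'Accept');
addVary(res, 'Accept'); // no duplicate added
console.log(res.getHeader('Vary')); // "Accept"
```

In the actual codebase, this would just be `require('vary')` and a `vary(res, 'Accept')` call before sending each response.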
Hi,
When I test the example, it currently gives me this error:
/tmp/json-api-example/src/index.js:25
, Controller = new API.controllers.API(registry);
^
TypeError: undefined is not a function
at Object.<anonymous> (/tmp/json-api-example/src/index.js:25:18)
at Module._compile (module.js:460:26)
at Object.Module._extensions..js (module.js:478:10)
at Module.load (module.js:355:32)
at Function.Module._load (module.js:310:12)
at Function.Module.runMain (module.js:501:10)
at startup (node.js:129:16)
at node.js:814:3
This should just be another simple check in the APIController to make sure the adapter for the type requested exists. Will handle cases in which bad requests are routed to the library, as happened in #9 (comment)
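A sketch of what that check could look like; the error shape and the way types map to adapters here are assumptions about the internals, not the library's actual structure:

```javascript
// Fail fast instead of hanging when no adapter is registered for a type.
function ensureAdapterExists(adaptersByType, type) {
  if (!Object.prototype.hasOwnProperty.call(adaptersByType, type)) {
    var err = new Error('No adapter registered for type "' + type + '".');
    err.status = 404; // an unknown type reads most naturally as Not Found
    throw err;
  }
  return adaptersByType[type];
}
```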
I found another place where the API can hang due to a missing rejection handler: https://github.com/ethanresnick/json-api/blob/master/src/db-adapters/Mongoose/MongooseAdapter.js#L184
The Promise that encapsulates the creation can reject if the user’s model validation fails, or if index validation fails (like trying to violate a unique index).
GET /people/1
when a person with an id of 1 doesn't exist
This library now has pretty good test coverage (about 90%, with 265 tests), but still, it'd be good to add tests that check the response `data` is even vaguely right (e.g. non-empty). The lowest priority is to write tests for the "middle complexity" features, like precise status codes and the like. Not that those aren't important; they're just lower priority.
If you PUT to a resource, and the data you're PUTting is in the right format but the db rejects it (e.g. because it fails to pass validation rules)...
How do you use this library? :D
I’d like to use the `beforeSave` function not so much as a transformation, but as a place to do some asynchronous work, and have the ability to carry out or cancel the save based on the results of the async task.

For example, the specific case I have right now is a resource that represents a file in an S3 bucket: I want to perform an upload to S3, which, upon succeeding, will continue the save to Mongo. If the bucket upload fails, though, I want to be able to `reject` or something and kick off an error response.
A related thought I’ve had for some time is that maybe it would be worth exploring making each of these steps a distinct middleware. Although that would mean a function signature specific to Connect and Express servers, so maybe it doesn’t jibe with the project’s vision. At any rate, I think these transforms could detect whether a Promise is being returned, and wait for it to settle if needed.
What do you think, @ethanresnick?
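To make the proposal concrete, here is a sketch of a promise-aware `beforeSave`; the `uploadToS3` stub and the convention that a returned promise gates the save are both hypothetical, not current json-api behavior:

```javascript
// Placeholder standing in for a real S3 upload; resolves with the file's URL.
function uploadToS3(file) {
  return Promise.resolve('https://example-bucket.s3.amazonaws.com/' + file);
}

// If the library awaited a returned promise, resolution would continue the
// save (with the transformed resource), and rejection would abort it and
// trigger an error response instead.
function beforeSave(resource) {
  return uploadToS3(resource.attrs.file).then(function (url) {
    resource.attrs.url = url; // carry on with the save, enriched
    return resource;
  });
}
```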
Relationship links should be contained in a links object but json-api currently adds the self-link as a plain string to the relationship object.
Actual:
"relationships": {
"author": {
"data": { ... },
"self": "example.com/books/12345/relationships/author"
}
}
Expected:
"relationships": {
"author": {
"data": { ... },
"links": {
"self": "example.com/books/12345/relationships/author"
}
}
}
I wanted to experiment a bit with multi-level include paths, but noticed that they aren't implemented yet. Maybe you could help me out and tell me which result I should expect when running the following query against the json-api example. In addition to the example, please assume that there is a to-many relationship `friends` of type `people` for each `people` type.

http://127.0.0.1:3000/schools/abc...?include=liaisons,liaisons.friends
Would I get a result looking like the following?
{
"data": {
// ...
"links": {
"liaisons": {
"linkage": [{ "type": "people", "id": "..." }, ...],
"friends": {
"linkage": [{"type: "people", "id":"..."}, ...]
}
}
}
}
}
The documentation told me to open this issue! :)
Anyway, I would like to try this with a PostgreSQL database, so an adapter for that would be welcome.
If there were some additional information on how adapters are supposed to work I might be able to look into this myself.
If a sub-type doesn't specify a `beforeSave` or `beforeRender` transform, but its parent type does, the sub-type should use the parent type's.
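A sketch of the fallback lookup; the registry shape and the `parentType` field are assumptions about how sub-typing is represented, not the library's actual internals:

```javascript
// Walk up the parent chain until some type description defines the transform.
function resolveTransform(descriptions, typeName, transformName) {
  var desc = descriptions[typeName];
  while (desc) {
    if (typeof desc[transformName] === 'function') {
      return desc[transformName];
    }
    desc = desc.parentType ? descriptions[desc.parentType] : null;
  }
  return undefined; // no transform anywhere in the chain
}
```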
It appears that when you use `sendError` as suggested in the sample project, it always hits the default case and sends a general unknown error:

{"errors":[{"status":"500","title":"An unknown error occurred while trying to process this request."}]}

After diving through the code a bit, it seems to only work if you send it an instance of `APIError`. Is that the case, or am I missing something?
If that is so, perhaps the documentation should be updated slightly? Something along the lines of:
var API = require('json-api');
var APIError = API.types.Error;
...
apiRouter.use(function(req, res, next) {
Front.sendError(new APIError(404, undefined, 'Not Found'), req, res);
});
Currently we are trying to execute ES6 tests in a module that has json-api as a dependency. When I require 'babel/register' in one of my build steps, I get the error "only one instance of babel/polyfill is allowed".

The problem is that we have two different versions of babel as dependencies (the one we specified in our top-level module and the one that is json-api's dependency), and both try to set `global._babelPolyfill`. Even though I've seen #11, this causes us serious headaches. Could you consider specifying babel as a dev dependency again?
Hey there! I just wanted to let you know that JSON API has hit RC3, and we'd like client libraries to start implementing it like it was 1.0, as a final check that we're good for 1.0. I wanted to make this issue to let you know!
I can get the `beforeRender` hook to remove specific fields, but I can't seem to get it to remove the resource altogether. Looking at the code, it seems as though the way to do it is to replace the resource with `undefined`, but that logic is only applied to Collections, and returning `undefined` from the `beforeRender` hook just results in a hung request for me.
It fails to remove the element when you try to set an element to null in a PUT.
I assume there was a reason for accepting it along with `application/vnd.api+json`, but right now any request with a body and that Content-Type hangs for me.
I’m opening this to discuss support for filtering. I started by needing some filtering and wondering whether this package already supported it. I couldn’t find anything about it in the docs, but I see some code floating around in `do-get.js`. I can’t figure out exactly how to use it or what syntax is supported, at least not without spending a lot of time debugging and reverse engineering it.
But filtering isn’t actually part of the JSON API spec, and there are bound to be cases where we need more complex filtering than it makes sense to incorporate into this package. I think it would make sense to remove any built-in implementation. But right now, it doesn’t look like there’s a place we (users of this package) can add our own implementations back in. Or maybe there is, but it’s not obvious or not documented?
Have you given any thought to a good way forward?
If you try to do this…

app.put('/people/:id', handler)

…requesting `PUT /people/1` results in a hang. A 404 would probably be best, or a crash with a nice error message telling the dev to use `PATCH`.
Right now, when the response data being transformed is:

- a single Resource, the `beforeSave`/`beforeRender` function is called once, with the Resource as an argument;
- a Collection, the `beforeSave`/`beforeRender` function is called once for each resource, with the Resource as an argument.

However, I want to maintain the constraint that the transform functions are always called with a single resource object, so that users writing `beforeSave`/`beforeRender` functions don't have to do any switching on types (which the codebase already does way too much of, in part because the types aren't expressive enough).

So, my inclination, when linkage is the subject of the transform, is to wrap the linkage in an empty Resource object (preferably of the correct type, and preferably with the linkage at the right relationship path) and pass that Resource to the transform, for consistency.

This shouldn't cause any problems because, if linkage needs to be transformed, it should be transformed in the same way whether it's returned alone, as part of a larger resource, or in a collection of resources. However, we could also pass the original/raw `Linkage|Collection|Resource` object as an extra argument to the transform function, just in case there are some edge cases for which knowing the original subject of the transform is important.
I currently have json-api setup behind a Nginx proxy. The proxy sets up a SSL tunnel. All the clients speak only over HTTPS, but everything behind the proxy is only HTTP.
Self links derive their protocol from the incoming request's protocol only. The simple fix is to use the traditional `X-Forwarded-Proto` header instead (when set).
There is probably a better way to handle this, but I'd like to get your thoughts before I post a PR. I would just change this line to:
it.uri = (req.get('X-Forwarded-Proto') || req.protocol) + "://" + req.get("Host") + req.originalUrl;
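Worth noting as an alternative: Express can handle this natively. With the `trust proxy` setting enabled, `req.protocol` is derived from `X-Forwarded-Proto` automatically, so the link-building line wouldn't need to read the header itself. A configuration sketch:

```javascript
var express = require('express');
var app = express();

// With 'trust proxy' on, req.protocol reflects X-Forwarded-Proto from the
// Nginx proxy, so self links get "https" even though Node sees plain HTTP.
app.set('trust proxy', true); // or restrict it, e.g. to 'loopback'
```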
The JSON API spec recommends that member names be dasherized. I think we should be dasherizing by default, but there probably needs to be an easy way to opt out of it. There are also some challenges to un-dasherizing (camelizing) some names. For instance, say you have a Mongoose schema like:
let PersonSchema = new mongoose.Schema({
name: String,
imageURL: String
});
It’s easy to dasherize `imageURL` to `image-url` on outgoing responses, but on an incoming POST request, for example, you might have a request body that looks like
{
"data": {
"type": "people",
"attributes": {
"name": "Andrew",
"image-url": "http://placehold.it/100x100"
}
}
}
and the camelization of `image-url` will result in `imageUrl` rather than `imageURL`.

So I’m wondering if we should build in a way to customize the dasherized/camelized relationship. In my project, I’m handling this by adding a virtual field called `imageUrl` to the schema.
We could kill two birds (the “opt-in/opt-out” bird and the “customized transform” bird) with one stone by allowing users to specify a pair of functions when defining resources:
import camelize from 'camelize';
import dasherize from 'dasherize';
registry.type('people', {
adapter: adapter,
urlTemplates: {
"self": "/people/{id}"
},
transformMemberNames: {
fromRequest: camelize, // request comes in dasherized, camelize it before accessing store
fromStore: dasherize // names are camelized in store, dasherize before sending response
}
});
Maybe, in the typical case, simple camelize/dasherize functions could be the default, but they would be easy to override on a per-resource basis, checking for special cases like
fromRequest: dasherized => {
if (dasherized === 'image-url') {
return 'imageURL';
}
return camelize(dasherized);
}
or to opt out of transformations:
transformMemberNames: {
fromRequest: name => name,
fromStore: name => name
}
I’m definitely not set on the exact naming above; maybe `fromRequest` and `fromStore` would be better as `deserialize` and `serialize`. But what do you think of going in this direction?
`GET /people?filter[simple][name][going][deeper][gives][errors]` gives a 500 error.

It might also be helpful for newcomers if `GET /people?filter[name]=John` gave an error of some sort. It took me a while to find `[simple]`.