
graphql-upload-minimal's Introduction

graphql-upload-minimal


Minimalistic and developer friendly middleware and an Upload scalar to add support for GraphQL multipart requests (file uploads via queries and mutations) to various Node.js GraphQL servers.

Acknowledgements

This module was ⚠️ forked from the amazing graphql-upload. The original module is exceptionally well documented and well written. It was very easy to fork and amend. Thanks Jayden!

I needed something simpler that wouldn't attempt any disk I/O. At the time there were no alternative server-side JavaScript modules for GraphQL file uploads, so this fork was born.

Differences from graphql-upload

Single production dependency - busboy

  • 9 fewer production dependencies.
  • 6 MB less in your node_modules.
  • Slightly lower memory usage.
  • Slightly faster.
  • Most importantly, less risk that one of the dependencies breaks your server.

More standard and developer friendly exception messages

Exception messages use ASCII-only text and direct developers towards resolving common mistakes.

Does not create any temporary files on disk

  • Works faster as a result.
  • No risk of clogging your file system, even under high load.
  • No need to manually destroy programmatically aborted streams.

Does not strictly follow the specification

You can't reference the same file twice in a GraphQL query/mutation.

API changes compared to the original graphql-upload

  • createReadStream() does not accept any arguments and will throw if any are provided.
  • createReadStream() can only be called once per file; a second call will throw.

Otherwise, this module is a drop-in replacement for graphql-upload.
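For illustration, this is roughly how a resolver consumes an upload with this module (a minimal sketch; storeSomewhere() is a hypothetical storage helper):

const { GraphQLUpload } = require("graphql-upload-minimal");

const resolvers = {
  Upload: GraphQLUpload,

  Mutation: {
    async upload(root, { file }) {
      const { createReadStream, filename } = await file;

      // Allowed: exactly one call, with no arguments.
      await storeSomewhere(filename, createReadStream());

      // Both of these would throw with this module:
      // createReadStream({ encoding: "utf8" }); // arguments are rejected
      // createReadStream();                     // a second call is rejected

      return true;
    },
  },
};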

Support

The following environments are known to be compatible:

See also GraphQL multipart request spec server implementations.

Setup

To install graphql-upload-minimal and the graphql peer dependency from npm run:

npm install graphql-upload-minimal graphql

Use the graphqlUploadKoa or graphqlUploadExpress middleware just before GraphQL middleware. Alternatively, use processRequest to create a custom middleware.

A schema built with separate SDL and resolvers (e.g. using makeExecutableSchema) requires the Upload scalar to be set up.
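For example, a sketch using @graphql-tools/schema (the Query and Mutation fields below are placeholders for your own schema):

const { makeExecutableSchema } = require("@graphql-tools/schema");
const { GraphQLUpload } = require("graphql-upload-minimal");

module.exports = makeExecutableSchema({
  typeDefs: /* GraphQL */ `
    scalar Upload
    type Query {
      ok: Boolean!
    }
    type Mutation {
      singleUpload(file: Upload!): Boolean!
    }
  `,
  resolvers: {
    // Wire the Upload scalar in the SDL to this module's implementation.
    Upload: GraphQLUpload,
    Query: { ok: () => true },
    Mutation: {
      async singleUpload(root, { file }) {
        const { filename, createReadStream } = await file;
        createReadStream().resume(); // consume the stream; replace with real storage logic
        return Boolean(filename);
      },
    },
  },
});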

Usage

Clients implementing the GraphQL multipart request spec upload files as Upload scalar query or mutation variables. Their resolver values are promises that resolve file upload details for processing and storage. Files are typically streamed into cloud storage but may also be stored in the filesystem.

Express.js

A minimal code example showing how to upload a file along with arbitrary GraphQL data and save it to an S3 bucket.

Express.js middleware. You must put it before the main GraphQL server middleware. Also, make sure there is no other Express.js middleware which parses multipart/form-data HTTP requests before the graphqlUploadExpress middleware!

const express = require("express");
const expressGraphql = require("express-graphql");
const { graphqlUploadExpress } = require("graphql-upload-minimal");

express()
  .use(
    "/graphql",
    graphqlUploadExpress({ maxFileSize: 10000000, maxFiles: 10 }),
    expressGraphql({ schema: require("./my-schema") })
  )
  .listen(3000);

GraphQL schema:

scalar Upload
input DocumentUploadInput {
  docType: String!
  file: Upload!
}

type SuccessResult {
  success: Boolean!
  message: String
}
type Mutations {
  uploadDocuments(docs: [DocumentUploadInput!]!): SuccessResult
}

GraphQL resolvers:

const { S3 } = require("aws-sdk");

const resolvers = {
  Upload: require("graphql-upload-minimal").GraphQLUpload,

  Mutations: {
    async uploadDocuments(root, { docs }, ctx) {
      try {
        const s3 = new S3({ apiVersion: "2006-03-01", params: { Bucket: "my-bucket" } });

        for (const doc of docs) {
          const { createReadStream, filename /*, fieldName, mimetype, encoding */ } = await doc.file;
          const Key = `${ctx.user.id}/${doc.docType}-${filename}`;
          await s3.upload({ Key, Body: createReadStream() }).promise();
        }

        return { success: true };
      } catch (error) {
        console.log("File upload failed", error);
        return { success: false, message: error.message };
      }
    },
  },
};

Koa

See the example Koa server and client.
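A minimal sketch of wiring the Koa middleware (yourGraphqlMiddleware stands in for whichever Koa GraphQL middleware you use):

const Koa = require("koa");
const { graphqlUploadKoa } = require("graphql-upload-minimal");

const app = new Koa();

// Parse multipart/form-data requests before the GraphQL middleware runs.
app.use(graphqlUploadKoa({ maxFileSize: 10000000, maxFiles: 10 }));

// app.use(yourGraphqlMiddleware); // e.g. koa-graphql or Apollo Server's Koa integration

app.listen(3000);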

AWS Lambda

Reported to be working.

const { processRequest } = require("graphql-upload-minimal");

module.exports.processRequest = function (event) {
  return processRequest(event, null, { environment: "lambda" });
};

Google Cloud Functions (GCF)

Possible example. Experimental. Untested.

const { processRequest } = require("graphql-upload-minimal");

exports.uploadFile = function (req, res) {
  return processRequest(req, res, { environment: "gcf" });
};

Azure Functions

Possible example. Working.

const { processRequest } = require("graphql-upload-minimal");

exports.uploadFile = function (context, req) {
  return processRequest(context, req, { environment: "azure" });
};

Uploading multiple files

When uploading multiple files you can use the fieldName property to keep track of an identifier for each uploaded file. fieldName is equal to the name of the file field in the multipart/form-data request. It can be modified to contain an identifier (such as a UUID), for example via formDataAppendFile in the commonly used apollo-upload-client library.
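As an illustration (a sketch, not from the original README), a resolver can read fieldName to match each upload to a client-side identifier:

async function uploadDocuments(root, { files }, ctx) {
  for (const upload of files) {
    const { createReadStream, fieldName, filename } = await upload;
    // fieldName is the multipart form field name the client chose for this file,
    // e.g. a UUID appended via formDataAppendFile.
    console.log(`Received ${filename} from form field ${fieldName}`);
    createReadStream().resume(); // consume the stream; replace with real storage logic
  }
  return { success: true };
}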

Tips

  • Only use createReadStream() before the resolver returns; late calls (e.g. in an unawaited async function or callback) throw an error. See the sketch below.
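A sketch of the difference (saveUpload() is a hypothetical storage helper):

async function uploadAvatar(root, { file }) {
  const { createReadStream } = await file;

  // OK: the stream is created and consumed before the resolver returns.
  await saveUpload(createReadStream());

  // Not OK: by the time this callback runs the request is already finished,
  // so createReadStream() would throw.
  // setImmediate(() => saveUpload(createReadStream()));

  return true;
}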

Architecture

The GraphQL multipart request spec allows a file to be used for multiple query or mutation variables (file deduplication), and for variables to be used in multiple places. GraphQL resolvers need to be able to manage independent file streams.

busboy parses multipart request streams. Once the operations and map fields have been parsed, Upload scalar values in the GraphQL operations are populated with promises, and the operations are passed down the middleware chain to GraphQL resolvers.
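Along those lines, a custom middleware built on processRequest would look roughly like this (a sketch assuming an Express-style req/res/next signature):

const { processRequest } = require("graphql-upload-minimal");

function uploadMiddleware(options) {
  return async (req, res, next) => {
    // Only multipart requests need the special handling.
    if (!req.is("multipart/form-data")) return next();
    try {
      // processRequest resolves to the parsed operations, with Upload
      // variables replaced by promises, ready for the GraphQL middleware.
      req.body = await processRequest(req, res, options);
      next();
    } catch (error) {
      next(error);
    }
  };
}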

graphql-upload-minimal's People

Contributors

akofman, alex0007, aseemk, carlmanaster, derschtefan, fberthelot, hongbo-miao, jaydenseric, koresar, lorenzodejong, mickvanduijn, mike-marcacci, miljoen, samcoenen, wtgtybhertgeghgtwtg


graphql-upload-minimal's Issues

Multiple file upload is showing as a single array with a promise. How to iterate and capture all files here?

I am also stuck with the same issue. Form data is being sent correctly.

(screenshot)

Mutation code is given below

mutation multipleUpload($file: [Upload]!) {
  multipleUpload(file: $file, name: "This is a test") {
    message
  }
}

Resolver code is given below

multipleUpload : async (args, contextValue) => {
    console.log(args.file);
    return {
        message: "File Uploaded"
    }
}

Console logging the file gives only a single value; see the screenshot below.

(screenshot)

Also, if I run the upload a second time, it goes into a waiting state, so I guess I didn't complete the promise.

Multi-file upload

First of all, thank you for putting out this package; it's a great alternative to the original (non-minimal) one.

I am using this with NestJS; we migrated away from jaydenseric's package recently. Single uploads work fine, but there is an issue with multiple uploads. I saw your mention of fieldName in the README, which we are passing.

Here is my browser's network request:

operations: {"query":"mutation ExampleMutation(\n  $input: ExampleMutationInput!\n) {\n  ExampleMutation(exampleMutationInput: $input) {\n    createdAt\n  }\n}\n","variables":{"input":{"files":[null,null,null],"price":0,}}}
map: {"FILE_1":["variables.input.files.0"],"FILE_2":["variables.input.files.1"],"FILE_3":["variables.input.files.2"]}
FILE_1: (binary)
FILE_2: (binary)
FILE_3: (binary)

and here is what it looks like in Nest/Node when I try to access the uploaded files:

[
  Promise {
    {
      fieldName: 'FILE_1',
      filename: 'FILE_Screenshot 2023-03-25 at 4.20.11 PM.png',
      mimetype: 'image/png',
      encoding: '7bit',
      createReadStream: [Function: createReadStream]
    },
    [Symbol(async_id_symbol)]: 2581,
    [Symbol(trigger_async_id_symbol)]: 2572,
    [Symbol(kResourceStore)]: I18nContext { lang: 'en', service: [I18nService], id: 1 }
  },
  Promise {
    <pending>,
    [Symbol(async_id_symbol)]: 2584,
    [Symbol(trigger_async_id_symbol)]: 2572,
    [Symbol(kResourceStore)]: I18nContext { lang: 'en', service: [I18nService], id: 1 }
  },
  Promise {
    <pending>,
    [Symbol(async_id_symbol)]: 2587,
    [Symbol(trigger_async_id_symbol)]: 2572,
    [Symbol(kResourceStore)]: I18nContext { lang: 'en', service: [I18nService], id: 1 }
  }
]

If I try to Promise.all() that array, it just hangs. If I try to await the second file, it also hangs. The first one works. I'm at a loss as to what else to do here. Am I sending this to my server incorrectly from the browser?

Thanks for any help!

How to use with apollo-server-express?

Hello,
apollo-server-express has graphql-upload built in. How do I force it to use this package instead?

Some resolve hacks?

I am trying to make it work with NestJS, but with no success so far.

// main.ts
import { ValidationPipe } from "@nestjs/common";
import { NestFactory } from "@nestjs/core";
import {
  ExpressAdapter,
  NestExpressApplication,
} from "@nestjs/platform-express";
import express from "express";
import { graphqlUploadExpress } from "graphql-upload-minimal";

import { AppModule } from "./app.module";

const expressApp = express().use(
  "/graphql",
  graphqlUploadExpress({ maxFileSize: 10000000, maxFiles: 10 })
);
async function bootstrap() {
  const app = await NestFactory.create<NestExpressApplication>(
    AppModule,
    new ExpressAdapter(expressApp)
  );

  // Validation
  app.useGlobalPipes(new ValidationPipe());

  // Cors
  if (process.env.CORS_ENABLE === "1") {
    app.enableCors();
  }
  return app;
}

async function main() {
  const app = await bootstrap();
  await app.listen(process.env.PORT || 5000);
}

main();
// main app.module
import { FirebaseAdminModule } from "@aginix/nestjs-firebase-admin";
import { Module } from "@nestjs/common";
import { ConfigModule, ConfigService } from "@nestjs/config";
import { GraphQLModule } from "@nestjs/graphql";
import * as admin from "firebase-admin";

@Module({
  imports: [
    ConfigModule.forRoot({ isGlobal: true }),
    GraphQLModule.forRootAsync({
      useFactory: async (configService: ConfigService) => ({
        autoSchemaFile:
          configService.get("GRAPHQL_SCHEMA_DEST") || "./src/schema.graphql",
        debug: configService.get("GRAPHQL_DEBUG") === "1" ? true : false,
        playground:
          configService.get("GRAPHQL_PLAYGROUND_ENABLE") === "1" ? true : false,
        context: ({ req, res }) => ({ req, res }),
      }),
      inject: [ConfigService],
    }),
    FirebaseAdminModule.forRootAsync({
      useFactory: (configService) => ({
        credential: admin.credential.applicationDefault(),
        databaseURL: configService.get("FIREBASE_DATABASE_URL"),
        storageBucket: configService.get("FIREBASE_STORAGE_BUCKET"),
      }),
      inject: [ConfigService],
    }),
  ],
  providers: [AppService, AppResolver],
})
export class AppModule {}

GraphQL itself works, but uploads do not. I am getting this error:

 +4158ms
[Nest] 4692   - 06/06/2020, 10:18:51 AM   [ExceptionsHandler] Object:
[
  {
    "message": "Request disconnected during file upload stream parsing.",
    "extensions": {
      "code": "INTERNAL_SERVER_ERROR",
      "exception": {
        "stacktrace": [
          "BadRequestError: Request disconnected during file upload stream parsing.",
          "    at IncomingMessage.abort (/var/bb/node_modules/graphql-upload/lib/processRequest.js:89:33)",
          "    at Object.onceWrapper (events.js:421:28)",
          "    at IncomingMessage.emit (events.js:327:22)",
          "    at abortIncoming (_http_server.js:533:9)",
          "    at socketOnClose (_http_server.js:525:3)",
          "    at Socket.emit (events.js:327:22)",
          "    at TCP.<anonymous> (net.js:674:12)"
        ]
      }
    }
  }
]
 +510ms

It looks like it is still using the graphql-upload package and not this one :/

Typescript

Thank you for making this.
Is the TypeScript declaration wrong?
TS2305: Module '"graphql-upload-minimal"' has no exported member 'graphqlUploadExpress'.
It would be great to support it.

Thank you 👍

Apollo-graphql-express

With apollo-graphql-express, can I just use the graphqlUploadExpress middleware? I was using graphql-upload and could never get it to work when uploading a file; I would get a POST error about a missing body-parser.

So I'm just trying to find out how to install it. This looks great as it is much smaller.

Thanks,
Steve

TypeScript declaration files

Hi,
Since you have already defined all the types internally with JSDoc, why not include them as .d.ts files inside the library? I could help out with that and make a PR if you agree. Another way would be to contribute to DefinitelyTyped if you don't want to include them in the library.

Throwing error doesn't work

I have noticed that throwing an error in a resolver that handles a file upload gives no result. The request hangs forever and the API client never gets a response. The max file size limit doesn't throw any exception when the limit is reached.

Similar questions:
meabed/graphql-upload-ts#78
#19

Typescript - forced dependency to koa/express

Hi, I'm using TypeScript and Express, and when I try to use this library I get:

node_modules/graphql-upload-minimal/index.d.ts:4:58 - error TS2307: Cannot find module 'koa' or its corresponding type declarations.

4 import { DefaultContext, DefaultState, Middleware } from "koa";

And I don't really want to install koa on the side just for this. I'll try to work around it somehow, but it'd be nice to fix it ;)

Question about how should a mutation be implemented using `graphqlUploadExpress`

Hi @koresar.

Following the example described in the README, it seems that fields defined as Upload in the schema end up as an array of Promises. Is that expected?

If that's correct, then the example should be like the following:

// As Is:
const { createReadStream, filename /*, fieldName, mimetype, encoding */ } = await doc.file;

// To Be:
const { createReadStream, filename /*, fieldName, mimetype, encoding */ } = (await doc).file;

The environment I tested:

  • graphql: 16.6.0
  • graphql-upload-minimal: 1.5.4
  • @apollo/server: 4.3.2
  • express: 4.18.2

Getting a socket hang up error when trying to upload images with NestJS

I'm using NestJS and TypeORM and am currently getting a socket hang up error whenever I try to upload an image. The operation goes through successfully when passing only text fields, but I get the error below as soon as I attempt to upload images.

The error message on the FE initially showed as the following:

ServerParseError: Unexpected token '<', "<!DOCTYPE "... is not valid JSON

After some digging through the network tab in Chrome, I found the socket hang up error:

{
  "props": { "pageProps": { "statusCode": 500 } },
  "page": "/_error",
  "query": { "__NEXT_PAGE": "/api/graphql" },
  "buildId": "development",
  "isFallback": false,
  "err": {
    "name": "Error",
    "source": "server",
    "message": "socket hang up",
    "stack": "Error: socket hang up\n at connResetException (node:internal/errors:691:14)\n    at Socket.socketCloseListener (node:_http_client:420:25)\n    at Socket.emit (node:events:402:35)\n    at TCP.\u003canonymous\u003e (node:net:687:12)"
  },
  "gip": true,
  "scriptLoader": []
}

The mutation I'm currently attempting to upload images with is createPost, and I can see through debugging that the image file is making it to the BE, and that the file is even saved partially, albeit with half of the image missing.

For my FE application, I'm using Next.js, Apollo Client, and apollo-upload-client.

I've followed the instructions from NestJS documentation for adding custom scalars and have also tried the method detailed here with no success.

Repos to reproduce with:
FE: https://github.com/forrestwilkins/social-ui/tree/graphql-image-uploads
BE: https://github.com/forrestwilkins/social-api/tree/graphql-image-uploads

Lastly, thank you for creating this fork. I really appreciate your efforts here!

Production mode still requires koa

According to index.d.ts, koa is always imported even if I want to use the package with Express. Koa is listed in devDependencies, so the problem affects production mode only.

graphqlUploadExpress and graphqlUploadKoa are missing in typings

Heyho,

thanks for notifying me at graphql-upload about your package.
I just took a quick look at it and it seems to work in general, but your index.d.ts is lacking the typings graphqlUploadExpress and graphqlUploadKoa. When trying to import them in a TypeScript project, you get the error that they are not exported by the package.

TypeError in processRequest() with AWS lambda

When I use processRequest in my AWS lambda function like this:

module.exports.hello = (event, context, callback) => {
    processRequest(event, null, { environment: "lambda" }).then((processed) => {
        // ...
    });
};

I get the following error:

ERROR	Unhandled Promise Rejection 	
{
    "errorType": "Runtime.UnhandledPromiseRejection",
    "errorMessage": "TypeError: Cannot read property 'once' of null",
    "reason": {
        "errorType": "TypeError",
        "errorMessage": "Cannot read property 'once' of null",
        "stack": [
            "TypeError: Cannot read property 'once' of null",
            "    at /var/task/node_modules/graphql-upload-minimal/public/process-request.js:378:14",
            "    at new Promise (<anonymous>)",
            "    at processRequest (/var/task/node_modules/graphql-upload-minimal/public/process-request.js:78:10)",
            "    at Runtime.module.exports.handle [as handler] (/var/task/handler.js:27:9)",
            "    at Runtime.handleOnce (/var/runtime/Runtime.js:66:25)"
        ]
    },
    "promise": {},
    "stack": [
        "Runtime.UnhandledPromiseRejection: TypeError: Cannot read property 'once' of null",
        "    at process.<anonymous> (/var/runtime/index.js:35:15)",
        "    at process.emit (events.js:310:20)",
        "    at process.EventEmitter.emit (domain.js:482:12)",
        "    at processPromiseRejections (internal/process/promises.js:209:33)",
        "    at processTicksAndRejections (internal/process/task_queues.js:98:32)"
    ]
}

Integration with Apollo Server

Hey! I'm using the Apollo Server (apollo-server-lambda) and I can see internally they are already integrating with graphql-upload:

const fileUploadHandler = (next: Function) => {
    const contentType = (
      event.headers['content-type'] || event.headers['Content-Type'] || ''
    ).toLowerCase();
    if (contentType.startsWith('multipart/form-data')
      && typeof processFileUploads === 'function') {
      const request = new FileUploadRequest() as IncomingMessage;
      request.push(
        Buffer.from(
          <any>event.body,
          event.isBase64Encoded ? 'base64' : 'ascii'
        )
      );
      request.push(null);
      request.headers = event.headers;
      processFileUploads(request, response, this.uploadsConfig || {})
        .then(body => {
          event.body = body as any;
          return next();
        })
        .catch(error => {
          throw formatApolloErrors([error], {
            formatter: this.requestOptions.formatError,
            debug: this.requestOptions.debug,
          });
        });
    } else {
      return next();
    }
  };

In the code above, extracted from Apollo Server, the processFileUploads function imports graphql-upload and uses its processRequest method to reassign event.body for my lambda function.

From my understanding, graphql-upload-minimal is a fork written as a replacement for graphql-upload that is useful in serverless environments. The thing is, though, Apollo Server is already handling that, and even though my lambda receives a base64-encoded event from the ALB, the file property in the resolver args already exposes a createReadStream method that worked just fine for creating a buffer from the received chunks.

Given that:

• Are we supposed to fork apollo-server and remove everything related to graphql-upload before using this graphql-upload-minimal?
• If that is the case, what is the advantage of using this library?

Hanging Uploads

If my request body has an invalid mapped file, the request just hangs...

curl --request POST \
  --url http://localhost:4000/ \
  --header 'Content-Type: multipart/form-data; boundary=---011000010111000001101001' \
  --form 'operations={
	"query": "mutation($input: UploadInput!) { upload(input: $input) { ok } }",
	"variables": { "input": { "files": [null] } }
}' \
  --form 'map={ "0": ["variables.input.filesz"] }' \
  --form '[email protected]'

The above has a typo "filesz" instead of "files". This request hangs forever.

I tested with graphql-upload and it errors out as expected. It even gives a hint about the issue:

Variable \"$input\" got invalid value { files: [null], filesz: { resolve: [function], reject: [function], promise: {}, file: [Object] } }; Field \"filesz\" is not defined by type \"UploadInput\". Did you mean \"files\"?

Hanging when uploading with GCF environment

Hey there,

Great work! I am encountering an issue with file upload. Wondering if you have any clue where I went wrong. Here are the details:

Problem: I tried to upload a single file via cURL:

curl -v -L http://localhost:5001/testAPI \
  -F operations='{ "query": "mutation($file: Upload!) { singleUpload(file: $file) }", "variables": { "file": null } }' \
  -F map='{ "0": ["variables.file"] }' \
  -F 0=@[PATH_TO_FILE]

The server just hangs. I can see the file upload being processed, but for some reason it's not calling the resolver.

Server code:

const express = require('express');
const {
  ApolloServer,
  gql,
} = require('apollo-server-cloud-functions');

const {
  graphqlUploadExpress,
  GraphQLUpload
} = require("graphql-upload-minimal");

const typeDefs = gql`
  # The implementation for this scalar is provided by the
  # 'GraphQLUpload' export from the 'graphql-upload' package
  # in the resolver map below.
  scalar Upload

  input DocumentUploadInput {
    docType: String!
    file: Upload!
  }

  type File {
    filename: String!
    mimetype: String!
    encoding: String!
  }

  type Query {
    # This is only here to satisfy the requirement that at least one
    # field be present within the 'Query' type.  This example does not
    # demonstrate how to fetch uploads back.
    otherFields: Boolean!
  }

  type Mutation {
    # Multiple uploads are supported. See graphql-upload docs for details.
    singleFile(file: DocumentUploadInput!): String!
    guff: String
  }
`;

const resolvers = {
  // This maps the `Upload` scalar to the implementation provided
  // by the `graphql-upload` package.
  Upload: GraphQLUpload,
  Query: {
    otherFields: () => {
      return true;
    },
  },

  Mutation: {
    guff: () => {
      return 'ok';
    },
    singleFile: async (root: any, args: any, context: any) => {
      console.log(args);
      const {file} = args;
      const { createReadStream } = await file;

      // // Invoking the `createReadStream` will return a Readable Stream.
      // // See https://nodejs.org/api/stream.html#stream_readable_streams
      createReadStream();
      return 'ok';
    },
  },
};

const server = new ApolloServer({
  typeDefs,
  resolvers,
});

export const TestHandler = server.createHandler({
  expressAppFromMiddleware(middleware: any) {
    const app = express();
    app.use(graphqlUploadExpress({ maxFileSize: 10000000, maxFiles: 10, environment: "gcf" }))
    app.use(middleware)
    return app;
  },
});

Cloud function handler:

const testAPI = functions
  .https.onRequest(TestHandler);

Environment:

  • running on firebase functions emulator
  • Node 14

Package version:

"apollo-server-cloud-functions": "^3.6.7",
"firebase-admin": "^10.1.0",
"firebase-functions": "^3.20.1",
"graphql": "^16.3.0",
"graphql-upload-minimal": "^1.4.0",

Looks like another middleware had processed this multipart request. Or maybe you are running in a cloud serverless function

I run processRequest(event, null, { environment: 'lambda' }); on AWS Lambda and get the following error:

InternalError: graphql-upload-minimal couldn't find any files or JSON. Looks like another middleware had processed this multipart request. Or maybe you are running in a cloud serverless function? Then see README.md.

No additional middleware is applied; event is what I get directly from Lambda. What is wrong?

The version is 1.2.1.
I use the graphql npm package as a server.

Getting graphql-upload-minimal does not allow calling createReadStream() multiple times

Hi! I'm using Node 14.15.3 with this package and I want to forward the upload to another HTTP service, but I'm getting the error:

graphql-upload-minimal does not allow calling createReadStream() multiple times. Please, consume the previously returned stream.

even though I explicitly only create the stream once. It must be happening elsewhere automatically.

I'm using NestJS with Express and I included this module's middleware in the startup. Any ideas @koresar? Thanks!

  async uploadImage(
    userId: string,
    uploadData: ImageUploadArg,
  ): Promise<{ id: string }> {
    const { createReadStream, filename, mimetype, encoding } = await uploadData.imageFile;
    const data = new FormData(); // FormData from the "form-data" package
    data.append('prefix', uploadData.prefix);
    data.append('category', uploadData.category);
    data.append('image', createReadStream());
    const resp = await this.httpService
      .post(
        `${this.configService.get('signalerxBaseUrl')}/leads/${userId}/images`,
        data,
        {
          headers: {
            ...data.getHeaders(),
            "Content-Type": `multipart/form-data; boundary=${data._boundary}`,
          },
        },
      )
      .toPromise()
      .then((res) => res.data);
    return resp;
  }

statusCode field in HttpError

Some HTTP Error utils rely on a statusCode field to check if an object is an HttpError. Is it possible to add a statusCode field to the HttpError class that will mirror the status field? I don't mind creating the PR.

Uploading multiple files: only one file appears properly, all other files come back as pending promises

I am using graphql-upload-minimal and getting this issue while uploading multiple files.
Some uploads are coming back as pending promises; only the first file appears properly.

Upload {
  resolve: [Function (anonymous)],
  reject: [Function (anonymous)],
  promise: Promise { [Object] },
  file: {
    fieldName: '0',
    filename: 'stock-photo.jpg',
    mimetype: 'image/jpeg',
    encoding: '7bit',
    createReadStream: [Function: createReadStream]
  }
},
Upload {
  resolve: [Function (anonymous)],
  reject: [Function (anonymous)],
  promise: Promise { }
},
Upload {
  resolve: [Function (anonymous)],
  reject: [Function (anonymous)],
  promise: Promise { }
},
Upload {
  resolve: [Function (anonymous)],
  reject: [Function (anonymous)],
  promise: Promise { }
}

Please check this issue and get back to me ASAP.

Always getting 'POST body missing, invalid Content-Type, or JSON object has no keys.'

When implementing this in an Azure Function, following the example posted by another engineer, I keep getting the response above.

const server = new ApolloServer({
  schema,
  async context(azure: { request: HttpRequest; context: azCtx }) {
    //is file upload
    if (azure.request.headers['content-type'].includes('multipart/form-data')) {
      //const body = await uploadFile(azure.context, azure.request);
      const body = await processRequest(azure.context, azure.request, { environment: 'azure' });
      console.log(body);
      azure.request.body = body;
    }

    const token = Auth.trimAuth(azure.request.headers.authorization);
    const requestIp = Auth.trimAuth(azure.request.headers['x-forwarded-for']);
    try {
      const patient = await Auth.validateToken(token);
      return new Context(azure.context, token, requestIp, patient);
    } catch {
      return new Context(azure.context, token, requestIp);
    }
  },
  formatError: (err) => {
    //Handler example for masking/hiding any global output
    // Don't give the specific errors to the client.
    if (err.message.startsWith('Database Error: ')) {
      return new Error('Internal server error');
    }
    // Otherwise return the original error. The error can also
    // be manipulated in other ways, as long as it's returned.
    return err;
  },
});

export default server.createHandler();

(Altair GraphQL client screenshot)

Multiple Files

Previously, using graphql-upload, we passed multiple files:

@Arg('files', () => GraphQLUpload) files: any[]

However, with this fork, I'm seeing that it only resolves the first file and the remainder are left in a pending state. (I can step through and see that resolve in Upload is only called once.)
