
custom-plugin-lib's People

Contributors

akelity, allevo, chiararelandini, davidebianchi, dependabot-preview[bot], elmalakomar, epessina, fabionappi, filipporezzonico, fredmaggiowski, giogia, giuliowaitforitdavide, greenkeeper[bot], ianfar96, ilteoood, jgiola, lucascanna, malta895, riccardozambito, silversoul93, simone-paglino, simonebergonzi


custom-plugin-lib's Issues

Microservice gateway env variable is required, but the gateway may not be deployed

Description

Here the MICROSERVICE_GATEWAY_SERVICE_NAME env variable is set as required, but the microservice-gateway service might not exist.

The presence of this variable may lead users to call the getServiceProxy function, whose URL points to the microservice-gateway (which would be unreachable).

An example use case is a project that initially contains the microservice-gateway and later removes it. If a service then calls the microservice-gateway, it fails with some error.

Proposed feature

My proposal is to make the MICROSERVICE_GATEWAY_SERVICE_NAME env variable not required.
If it is set (and not empty), getServiceProxy keeps working as it does today (the microservice-gateway service is called).

If, on the other hand, the env variable is not set (or is empty) and the service calls the getServiceProxy method, the service should throw a clear error. In this way, it is possible to catch the issue along with an explanation of how to fix it.
If possible, it would be great if the thrown error could be raised at service startup.
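
A minimal sketch of the proposed behavior (names and structure are illustrative, not the library's actual code):

// Hypothetical guard: the env variable becomes optional, but calling
// getServiceProxy without it fails with an explicit, actionable message.
function buildGetServiceProxy(microserviceGatewayServiceName) {
  return function getServiceProxy(options) {
    if (!microserviceGatewayServiceName) {
      throw new Error(
        'MICROSERVICE_GATEWAY_SERVICE_NAME is not set: configure it, ' +
        'or use getDirectServiceProxy to reach the target service directly'
      )
    }
    // ...build the proxy towards the microservice-gateway as today...
  }
}

Ideally the same check could also run at startup, so a misconfigured service fails fast instead of failing at the first proxied call.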

An in-range update of @types/node is breaking the build 🚨

The dependency @types/node was updated from 12.12.1 to 12.12.2.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

@types/node is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

returnAs: 'BUFFER' and allowedStatusCodes as options in delete with 204 gives error

The description of the bug or the rationale of your proposal

Passing both returnAs: 'BUFFER' and allowedStatusCodes: [...] in the options object of a service's delete function raises an "Unexpected end of JSON input" error when the response is a 204 No Content.

The description of the bug or the rationale of your proposal

Find the line of code where the JSON.parse call creates this problem and add a conditional so it is not invoked when the response has no body.
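
A minimal sketch of such a guard (assuming the response body is available as a Buffer; this is not the library's actual code):

// Parse the body only when there is something to parse; otherwise return null.
function parseBodyIfPresent(buffer) {
  if (!buffer || buffer.length === 0) {
    return null
  }
  return JSON.parse(buffer.toString('utf8'))
}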

A snippet of code for replicating the issue or showing the proposal usage if applicable

To replicate the issue:

// Replication sketch: nock and assert are standard modules; obtaining the
// service instance (here via serviceBuilder) depends on how you use the lib.
const assert = require('assert')
const nock = require('nock')

nock('http://my-service-name')
  .delete('/foo')
  .reply(404)

const service = serviceBuilder('my-service-name')

// inside an async test
try {
  await service.delete(
    '/foo',
    undefined,
    undefined,
    { returnAs: 'BUFFER', allowedStatusCodes: [204] }
  )
} catch (error) {
  assert.equal(error.message, 'Invalid status code: 404. Allowed: 204.')
}

The expected result for your bug

Running the snippet, the test fails: instead of the 'Invalid status code: 404. Allowed: 204.' error asserted above, an "Unexpected end of JSON input" error is raised.

Your environment

node: 15.10.0

custom-plugin-lib: 4.2.0

os: --

Missing `customMetrics` from DecoratedFastify type

Hi,

using custom-plugin-lib with TypeScript I noticed that the DecoratedFastify type is missing the optional customMetrics field, which contains all the custom metrics defined for a service.

Proposal

I would propose to add it, so that it is not necessary to extend the type in order to use custom metrics when they are enabled.
This could be achieved by editing the DecoratedFastify interface and adding the following field definition:

...
customMetrics?: Record<string, Metric<string>>
...

Your environment

node: v16
npm: 8.5.0

custom-plugin-lib: v4.3.2

An in-range update of fastify is breaking the build 🚨

Version 1.11.2 of fastify was just published.

Branch Build failing 🚨
Dependency fastify
Current Version 1.11.1
Type dependency

This version is covered by your current version range and after updating it in your project the build failed.

fastify is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).
  • coverage/coveralls: First build on greenkeeper/fastify-1.11.2 at 96.8% (Details).

Release Notes v1.11.2

Internals

  • Handle promises in the error handler with the same logic of normal handlers - #1134
  • Rename ContentTypeParser - #1123
  • after should not cause inject() to be called - #1132

Documentation

  • Add trivikr@ to the collaborators list - #1139
  • Updated ecosystem doc - #1137
Commits

The new version differs by 13 commits.

  • 4e047a8 Bumped v1.11.2
  • c40ea62 Add trivikr@ to the collaborators list (#1139)
  • 0a27c92 Correct typos in Github Issue Template (#1140)
  • 5b18645 Updated ecosystem doc (#1137)
  • 0a874b9 Handle promises in the error handler with the same logic of normal handlers (#1134)
  • cce1a85 Rename ContentTypeParser (#1123)
  • 6d302a5 Add test for error fixed in mcollina/avvio#74 (#1132)
  • 60b85e7 Update Validation-and-Serialization.md (#1124)
  • d6982ea Remove/Merge redundant decorate functions (#1120)
  • baeebef Updated standard to v12. (#1121)
  • 7c8401d Update ContentTypeParser.js (#1122)
  • a14397d ecosystem in alphabetical order
  • 8a0c618 Update Ecosystem.md (#1125)

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

An in-range update of @types/node is breaking the build 🚨

The dependency @types/node was updated from 12.7.0 to 12.7.1.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

@types/node is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).
  • coverage/coveralls: First build on greenkeeper/@types/node-12.7.1 at 97.245% (Details).

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Support custom system-wide TLS options

The feature or bug you are proposing

Since #84, serviceBuilder supports custom TLS options; it would be a good feature to be able to configure system-wide TLS options for the whole library.

The description of the bug or the rationale of your proposal

The use case we'd like to address is giving users the ability to configure their services with custom TLS certificates for all the serviceProxies they create. We figured that providing environment variables holding file paths (mounted as files, e.g. via k8s ConfigMaps/Secrets) would be a good solution: if these variables are provided, the library reads them at boot and passes the proper TLS options to every serviceBuilder invocation.

Note: if a serviceBuilder invocation provides specific TLS options, they should override the system-wide ones.
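
A minimal sketch of how such system-wide options could be read at boot (the env variable names are hypothetical, not an agreed-upon interface):

const fs = require('fs')

// Hypothetical env variables pointing at certificate files mounted on disk
// (e.g. from Kubernetes ConfigMaps/Secrets).
function readSystemWideTlsOptions(env = process.env) {
  const { CA_CERT_PATH, CLIENT_CERT_PATH, CLIENT_KEY_PATH } = env
  return {
    ...(CA_CERT_PATH ? { ca: fs.readFileSync(CA_CERT_PATH) } : {}),
    ...(CLIENT_CERT_PATH ? { cert: fs.readFileSync(CLIENT_CERT_PATH) } : {}),
    ...(CLIENT_KEY_PATH ? { key: fs.readFileSync(CLIENT_KEY_PATH) } : {}),
  }
}

// At boot these options would be merged into every serviceBuilder invocation,
// with per-serviceBuilder TLS options taking precedence.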

Unable to use "+" character in query parameters

The feature or bug you are proposing

The + character is removed when parameters are parsed if the params are not URL-encoded. E.g. a GET using an "email" parameter whose value contains a + returns an empty array, instead of the array with the corresponding user.

The description of the bug or the rationale of your proposal

More precisely, the + character is substituted by fastify when parsing the query params, since the + character in a query parameter corresponds to the whitespace character (see here for further info).

The expected result for your bug

The plugin should encode the URL, so that special characters are not lost when they reach fastify.

However, we're not 100% sure that the plugin should perform the encoding automatically. Please also note that this change might be breaking for anyone who inserts the + character in a query parameter on purpose to encode a whitespace.
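
For reference, percent-encoding the value before building the query string preserves the plus sign (the email below is a made-up example):

// A literal '+' in a query string is decoded as a space by the parser,
// while its percent-encoded form '%2B' survives the round trip.
const email = 'jane+doe@example.com' // hypothetical value
const encoded = encodeURIComponent(email)

console.log(`/users?email=${email}`)   // '+' will be parsed as a space
console.log(`/users?email=${encoded}`) // /users?email=jane%2Bdoe%40example.com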

Allow the proxies functions to receive query parameters as string into the path of the request

The feature or bug you are proposing

The proxy functions, exposed by the proxies obtained with getServiceProxy and getDirectServiceProxy, should allow the user to pass the query parameters directly in the path of the request.

A snippet of code for replicating the issue or showing the proposal usage if applicable

const myProxy = service.getDirectServiceProxy('localhost:3000')
const response = await myProxy.get('/myPath?name=Mike&surname=Shinoda') //error

The expected result for your bug

The code above should work correctly.
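
Until such support exists, a small workaround sketch that splits a path-with-query string into the pieces the proxy already accepts separately (the helper name is hypothetical):

// Split '/myPath?name=Mike&surname=Shinoda' into a plain path and a query object,
// using Node's WHATWG URL parser with a throwaway base.
function splitPathAndQuery(pathWithQuery) {
  const url = new URL(pathWithQuery, 'http://placeholder')
  return {
    path: url.pathname,
    query: Object.fromEntries(url.searchParams),
  }
}

const { path, query } = splitPathAndQuery('/myPath?name=Mike&surname=Shinoda')
// path  -> '/myPath'
// query -> { name: 'Mike', surname: 'Shinoda' }
// path and query can then be passed to the proxy method as separate arguments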

Your environment

node: 12

custom-plugin-lib: v1.1.1

`Config` type in the `CustomService` is too restrictive

The feature or bug you are proposing

The user should be able to use a config with any property value type.

The description of the bug or the rationale of your proposal

The type CustomService<Config extends ServiceConfig = ServiceConfig> in the customPlugin namespace is too restrictive because of the extends ServiceConfig constraint, since ServiceConfig is a NodeJS.Dict<string | number> by default. The type then breaks whenever a property in your config is neither a string nor a number (e.g. a boolean).

A snippet of code for replicating the issue or showing the proposal usage if applicable

The type breaks if you declare something as such:

type Env = {prop: boolean}
const customService = customPlugin<Env>(envJsonSchema)

The expected result for your bug

I would expect the ServiceConfig type to not be a NodeJS.Dict<string | number> but a NodeJS.Dict<unknown>

Your environment

node: v16.17.0
custom-plugin-lib: 5.1.3
os: Ubuntu

Env variables from file are not read

We created a new service using custom-plugin-lib version 5.1.5. From the changelog it looks like a bug involving environment variables was fixed in 5.1.5 itself, but even with the latest version we hit a problem at startup.
It seems that the environment variable file (default.env or local.env) is not read by the service.

The command I'm using is

set -a && source default.env && npx lc39 build/index.js

or

set -a && source default.env && npm start

npm start is lc39 build/index.js

[screenshot attached in the original issue]

I was not able to find a solution in the changelog of custom-plugin-lib or lc39; do you know what the problem could be?

Changing the version to 4.3.2 solves the problem.

isMiaHeaderInjected proxy headers if set in ADDITIONAL_HEADERS_TO_PROXY env

Describe the bug
When I set the isMiaHeaderInjected option to false, the headers listed in the ADDITIONAL_HEADERS_TO_PROXY env var are still proxied if I use the http client from the request object.

Furthermore, if I put the Mia headers themselves in the ADDITIONAL_HEADERS_TO_PROXY list, they are proxied anyway.

Expected behavior
I expect headers not to be proxied when I set this option.
We have two possibilities:

  1. when isMiaHeaderInjected is false, do not forward the additional headers to proxy either
  2. add another option to avoid proxying all headers, not only the ones flagged as Mia headers

Desktop (please complete the following information):

  • Version: 6.0.0

null clientType is converted to 'null' string

The feature or bug you are proposing

The library converts the null value into the "null" string for headers, e.g. for CLIENTTYPE_HEADER_KEY. This results in forwarding that header with the "null" string as its value even when the header is actually not provided/forwarded.

The description of the bug or the rationale of your proposal

The library should not forward headers when they are set to null, or at least it should forward them with an empty string instead.

Note that while this bug has been discovered for CLIENTTYPE_HEADER_KEY, it also impacts other headers.

A snippet of code for replicating the issue or showing the proposal usage if applicable

A possible solution can be replacing:

function getClientType(clientType) {
  return clientType || null
}

with:

function getClientType(clientType) {
  return clientType || ''
}

The expected result for your bug

The library should not forward a header if the header is set to null.

Your environment

node: 12.19.0-alpine
custom-plugin-lib: 4.2.0
os: --

getHttpClient from service does not forward mia headers

The docs do not distinguish between the getHttpClient function decorated onto the fastify instance and the one decorated onto the request instance, but there is indeed a difference.

Using the function decorated onto the fastify instance, the obtained client does not forward Mia headers, while using the same function decorated onto the request instance, the obtained client does forward those headers.

I would expect the client obtained from the fastify instance to still be able to forward Mia headers, or at least that this difference be pointed out in the docs.

environment

custom-plugin-lib: 5.1.3

Issue on validation when exposing a multipart/form-data API

The feature or bug you are proposing

Issue on validating a multipart/form-data request

The description of the bug or the rationale of your proposal

The schema definition of a multipart/form-data part requires the following body structure:

{
  additionalProperties: false,
  properties: {
    part: {
      description: 'Binary content of the file to upload',
      format: 'binary',
      type: 'string',
    },
  },
  required: ['part'],
  type: 'object',
}

Calling the endpoint with a binary file uploaded through the part field returns a validation error stating: 'body.part must be string'.

The expected result for your bug

I expect that this validation error should not happen

Your environment

node: 8.15.0

custom-plugin-lib: 4.3.2

os: 20.04.5 LTS (Focal Fossa), Ubuntu

Remove required env variables

The feature or bug you are proposing

Here is where the default env variables required by the cpl are set:

const baseSchema = {
  type: 'object',
  required: [
    USERID_HEADER_KEY,
    GROUPS_HEADER_KEY,
    CLIENTTYPE_HEADER_KEY,
    BACKOFFICE_HEADER_KEY,
    MICROSERVICE_GATEWAY_SERVICE_NAME,
  ],
  properties: {
    [USERID_HEADER_KEY]: {
      type: 'string',
      description: 'the header key to get the user id',
      minLength: 1,
    },
    [USER_PROPERTIES_HEADER_KEY]: {
      type: 'string',
      description: 'the header key to get the user permissions',
      minLength: 1,
      default: 'miauserproperties',
    },
    [GROUPS_HEADER_KEY]: {
      type: 'string',
      description: 'the header key to get the groups comma separated list',
      minLength: 1,
    },
    [CLIENTTYPE_HEADER_KEY]: {
      type: 'string',
      description: 'the header key to get the client type',
      minLength: 1,
    },
    [BACKOFFICE_HEADER_KEY]: {
      type: 'string',
      description: 'the header key to get if the request is from backoffice (any truly string is true!!!)',
      minLength: 1,
    },
    [MICROSERVICE_GATEWAY_SERVICE_NAME]: {
      type: 'string',
      description: 'the service name of the microservice gateway',
      pattern: '^(?=.{1,253}.?$)[a-z0-9](?:[a-z0-9-]{0,61}[a-z0-9])?(?:.[a-z0-9](?:[-0-9a-z]{0,61}[0-9a-z])?)*.?$',
    },
    [ADDITIONAL_HEADERS_TO_PROXY]: {
      type: 'string',
      default: '',
      description: 'comma separated list of additional headers to proxy',
    },
    [ENABLE_HTTP_CLIENT_METRICS]: {
      type: 'boolean',
      default: false,
      description: 'flag to enable the httpClient metrics',
    },
  },
}

For example, the MICROSERVICE_GATEWAY_SERVICE_NAME env var is required even if the microservice-gateway does not exist in the project.

From my point of view it's fine that these variables are configurable, but this env var should not be required. Also, MICROSERVICE_GATEWAY_SERVICE_NAME should be optional and should enable/disable the pre/post decorator functions (raising an error when they are used without it).

Decorate the axios response with the duration of the response.

The feature or bug you are proposing

Decorate the axios response with the duration of the response.

The description of the bug or the rationale of your proposal

Due to the need to proxy various calls from one system to another, it could be useful to decorate the axios response with the duration of the response.

A snippet of code for replicating the issue or showing the proposal usage if applicable

Here you can find an example of this implementation for the request-manager.
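
A minimal sketch of the idea with plain axios interceptors (this is not the request-manager implementation referenced above; the metadata and duration property names are illustrative):

const axios = require('axios')

const client = axios.create()

// Record the start time of every request on its config object.
client.interceptors.request.use((config) => {
  config.metadata = { startTime: Date.now() }
  return config
})

// Decorate every response with the elapsed time in milliseconds.
client.interceptors.response.use((response) => {
  response.duration = Date.now() - response.config.metadata.startTime
  return response
})

// response.duration can then be logged or exported as a metric.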

The expected result for your bug

It could be useful to use this value to log when a response doesn't respect a certain time limit, or to expose a metric tracking the duration of this kind of response.

Your environment

node: v16.16.0

custom-plugin-lib: v5.1.1

os: Ubuntu 20.04.4 LTS

Missing "prefix" parameter in Service creation of getDirectServiceProxyFromUrlString function

When a URL is passed to the getDirectServiceProxyFromUrlString function, the created Service object does not contain the path of the passed URL as prefix.

The code snippet below shows the way in which the Service object is built:

return serviceBuilder(
    completeUrl.hostname,
    requestMiaHeaders,
    {
      protocol: completeUrl.protocol,
      port: completeUrl.port,
      ...baseOptions,
    })

Example: if the URL is http://example.com/path, /path is not considered when building the Service object.
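
A sketch of a possible fix based on the snippet above, assuming the options object accepts a prefix field (as the issue title suggests); this is not a confirmed API:

return serviceBuilder(
    completeUrl.hostname,
    requestMiaHeaders,
    {
      protocol: completeUrl.protocol,
      port: completeUrl.port,
      prefix: completeUrl.pathname, // hypothetical option carrying '/path'
      ...baseOptions,
    })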

Default parsing of response payload to JSON with statusCode 200 throw error if the payload is empty

The proxy obtained with getDirectServiceProxy cannot handle empty responses with status code 200, because it returns the following syntax error: SyntaxError: Unexpected end of JSON input.

This error is thrown because of the automatic JSON parsing of the response; indeed, setting the returnAs option to 'BUFFER' makes it possible to work around the problem.

Workaround example:

const proxy = request.getDirectServiceProxy(BASE_URL)
const response = await proxy.post(`myURL`, body, querystring, { returnAs: 'BUFFER' })

DecoratedRequest typing support to Query, Headers and Params

The feature or bug you are proposing

Extend DecoratedRequest TypeScript support to Querystring, Headers and Params too.

The description of the bug or the rationale of your proposal

Right now the [DecoratedRequest interface](https://github.com/mia-platform/custom-plugin-lib/blob/9f20da345f208c391527c9c74e4bb7bcbc2955c2/index.d.ts#L60) allows specifying the generic only for the inner FastifyRequest body, meaning that I can't specify the shape of any other request parameter (Querystring, Headers or Params).

A snippet of code for replicating the issue or showing the proposal usage if applicable


interface HelloQueryParameters {
    who?: string
}

interface HelloRequest {
    Querystring?: HelloQueryParameters
}

service.addRawCustomPlugin('GET', '/hello', async function (request:DecoratedRequest<HelloRequest>, reply:FastifyReply<any>) {
        return { message: `Hello ${request.getUserId() || (request.query as HelloQueryParameters).who}` }
}, schema)

In the snippet above, request.query is treated as unknown.

The expected result for your bug


service.addRawCustomPlugin('GET', '/hello', async function (request:DecoratedRequest<HelloRequest>, reply:FastifyReply<any>) {
        return { message: `Hello ${request.getUserId() || request.query.who}` }
}, schema)

Your environment

node: v12.17.0
custom-plugin-lib: @mia-platform/[email protected]
os: MacOS

Default logs in TRACE or DEBUG level

The description of the bug or the rationale of your proposal

It's quite common for developers to forget to implement some logs that are essential for debugging a service's behavior and its integration with external systems. Some of these logs are always the same, and the developer should be able to take them for granted, so that she can focus on the logs for her specific domain.
Here is a proposal for some of these default (or boilerplate) logs:

  1. For each route defined by means of addRawCustomPlugin, the following logs should be set:
    a. a TRACE/DEBUG log of the entire request with headers and payload
    b. a TRACE/DEBUG log of the entire response with headers and payload

Sensitive information should be taken into account in order not to accidentally log it. A simple solution to this problem could be an opt-out boolean flag, set in the JSON schema of the endpoint, for all the fields that should be omitted from the logs (like all the personal information).
Here's an example of such a JSON schema:

{
  "headers": {
    "type": "object",
    "properties": {
      "authorization": { "type": "string", "hide": true }
    }
  },
  "body": {
    "type": "object",
    "properties": {
      "name": { "type": "string" },
      "surname": { "type": "string" },
      "taxCode": { "type": "string", "hide": true }
    }
  },
  "response": {
    "200": {
      "type": "object",
      "properties": {
        "medicalRecordID": { "type": "string", "hide": true }
      }
    }
  }
}

  2. For each call that is directed to an external service by means of a service proxy:
    a. a TRACE/DEBUG log of the entire request that is forwarded to the external service
    b. a TRACE/DEBUG log of the entire response that is received from the external service

With these default/boilerplate logs, the developer would not need to remember to add noisy logs to her code, and she would know that it only takes setting the right LOG_LEVEL environment variable to have all the information she needs to debug a problem.
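
A minimal sketch of what such boilerplate route logs could look like with plain Fastify hooks (redaction of the "hide" fields is omitted, and this is not the library's implementation):

// Register trace-level request/response logs for every route.
async function boilerplateLogs(fastify) {
  fastify.addHook('preHandler', async (request) => {
    request.log.trace({ headers: request.headers, body: request.body }, 'incoming request')
  })

  fastify.addHook('onSend', async (request, reply, payload) => {
    request.log.trace({ headers: reply.getHeaders(), payload }, 'outgoing response')
    return payload
  })
}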

getServiceProxy does not return on 204 response and returnAs: 'BUFFER' if using node >= 13.0.0

The feature or bug you are proposing

getServiceProxy does not return on 204 response and returnAs: 'BUFFER'

The description of the bug or the rationale of your proposal

If the service is using a version of node >= 13.0.0, getServiceProxy is used to call a DELETE on the microservice-gateway, and the response is a 204 No Content, the service gets stuck.

The problem seems to be in the simple-concat function called by simpleGet.concat in simple-get/index.js, which gets stuck because the data and end events are never emitted.

simpleGet.concat = (opts, cb) => {
  return simpleGet(opts, (err, res) => {
    if (err) return cb(err)
    concat(res, (err, data) => {
    if (err) return cb(err)
      if (opts.json) {
        try {
          data = JSON.parse(data.toString())
        } catch (err) {
          return cb(err, res, data)
        }
      }
      cb(null, res, data)
    })
  })
}

simple-concat

module.exports = function (stream, cb) {
  var chunks = []
  stream.on('data', function (chunk) {
    chunks.push(chunk)
  })
  stream.once('end', function () {
    if (cb) cb(null, Buffer.concat(chunks))
    cb = null
  })
  stream.once('error', function (err) {
    if (cb) cb(err)
    cb = null
  })
}

A snippet of code for replicating the issue or showing the proposal usage if applicable

Theoretically it should be possible to reproduce the bug using this code snippet:

const query = {
  bucket: 'TestBucket1',
}

const customProxy = getServiceProxy(MICROSERVICE_GATEWAY_SERVICE_NAME, { port: 3001 })
nock('http://localhost:3001', { 'encodedQueryParams': true })
  .delete('/media-storage/media-id', {})
  .query({ 'bucket': 'TestBucket1' })
  .reply(204, '', [
    'date',
    'Wed, 21 Apr 2021 16:40:26 GMT',
    'content-type',
    'application/json; charset=utf-8',
    'content-length',
    '4',
    'Connection',
    'close',
  ])

let result
try {
  result = await customProxy.delete('/media-storage/media-id', {}, query, {
    headers: {
      miauserproperties: '{"permissions": ["MediaStorage.TestBucket1"]}',
    },
    returnAs: 'STREAM',
  })
} catch (error) {
  console.log('error', error.message)
}

but in practice I managed to reproduce the bug only by calling the actual microservice-gateway.

The expected result for your bug

The called endpoint should correctly return a response

Your environment

node: ^13.0.0 || ^14.0.0

custom-plugin-lib: 2.3.0

os: tested on k8s

Make service proxy available for testing

It's very common to need to unit test a module that relies on a service proxy, which it receives at runtime via getDirectServiceProxy, typically by dependency injection. In this case, the only way is to manually build a stub of the service proxy in order to make assertions about the calls to the proxied service.
It would be useful to have such a stub (builder?) available inside the lib and requirable by the unit test.

const {ServiceProxyStub} = require('@mia-platform/custom-plugin-lib')

const MyService = require('...')

const serviceStub = new ServiceProxyStub()
const myService = new MyService(serviceStub)

myService.doStuff()
assertCallsToService(serviceStub)

The alternative is to make the getDirectServiceProxy method available outside the context of the custom plugin, so that the test can create an actual service proxy and then use libraries like nock to intercept the HTTP calls (it's still stubbing, just at the HTTP level).

const nock = require('nock')

const {getDirectServiceProxy} = require('@mia-platform/custom-plugin-lib')

const MyService = require('...')

const service = getDirectServiceProxy('crud-service', {...})
const myService = new MyService(service)

setupInterceptors(nock)
myService.doStuff()
assertInterceptors(nock)

Should forward the request id in internal calls

The feature you are proposing

In service calls, we should automatically forward the platform headers (like x-request-id).

A snippet of code for replicating the issue or showing the proposal usage if applicable

I suggest two possible ways to handle this:

  1. add an internal proxy function
  2. add an option to declare if default platform headers should be passed or not.

Both solutions should concatenate the platform headers with the headers set in the ADDITIONAL_HEADERS_TO_PROXY env var.
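
A minimal sketch of that merge (the header names and the helper are illustrative, not the library's code):

// Platform headers that should always be forwarded in internal calls.
const PLATFORM_HEADERS_TO_PROXY = ['x-request-id']

// Pick, from the incoming request headers, the platform headers plus the
// additional ones configured via ADDITIONAL_HEADERS_TO_PROXY.
function headersToProxy(requestHeaders, additionalHeadersToProxy = []) {
  const keys = [...PLATFORM_HEADERS_TO_PROXY, ...additionalHeadersToProxy]
  return Object.fromEntries(
    keys
      .filter((key) => key in requestHeaders)
      .map((key) => [key, requestHeaders[key]])
  )
}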

Certificates that are used in tests will expire in the far future

In the repo, tests/fixture/keys contains certificates (client.crt, server.crt, ca.crt) that will expire on 3 September 3061.
So far this is not a problem, but it could be useful to keep track of it.
A possible solution is to generate new certificates every time the tests are run, or to update these certificates before their expiry date.

Unable to use `if-then-else` syntax at root level in json-schema of envs

The feature or bug you are proposing

Users should be able to use if-then-else, oneOf, anyOf at the root level of the env json-schema.

The description of the bug or the rationale of your proposal

Since the library transforms the user-defined json-schema here, any if-then-else, oneOf, anyOf, etc. the user wrote is lost, and the json-schema applied to the environment variables ends up being different from what the user expected.

A snippet of code for replicating the issue or showing the proposal usage if applicable

Pass the following json-schema to the library:

{
  type: 'object',
  required: [],
  if: {
    properties: {
      FLAG: { type: 'boolean', const: true}
    }
  },
  then: {
      required: [
        'BAR'
      ],
      properties: {
        FLAG: { type: 'boolean', const: true},
        BAR: { type: 'string' }
      }
    },
    else: {
      required: [
        'FOO'
      ],
      properties: {
        FLAG: { type: 'boolean', const: false},
        FOO: { type: 'string' }
      }
    }
}

The resulting json-schema applied to the envs is going to be this:

{
  "type": "object",
  "required": [
    "USERID_HEADER_KEY",
    "GROUPS_HEADER_KEY",
    "CLIENTTYPE_HEADER_KEY",
    "BACKOFFICE_HEADER_KEY",
    "MICROSERVICE_GATEWAY_SERVICE_NAME"
  ],
  "properties": {
    "USERID_HEADER_KEY": {
      "type": "string",
      "description": "the header key to get the user id",
      "minLength": 1
    },
    "USER_PROPERTIES_HEADER_KEY": {
      "type": "string",
      "description": "the header key to get the user permissions",
      "minLength": 1,
      "default": "miauserproperties"
    },
    "GROUPS_HEADER_KEY": {
      "type": "string",
      "description": "the header key to get the groups comma separated list",
      "minLength": 1
    },
    "CLIENTTYPE_HEADER_KEY": {
      "type": "string",
      "description": "the header key to get the client type",
      "minLength": 1
    },
    "BACKOFFICE_HEADER_KEY": {
      "type": "string",
      "description": "the header key to get if the request is from backoffice (any truly string is true!!!)",
      "minLength": 1
    },
    "MICROSERVICE_GATEWAY_SERVICE_NAME": {
      "type": "string",
      "description": "the service name of the microservice gateway",
      "pattern": "^(?=.{1,253}.?$)[a-z0-9](?:[a-z0-9-]{0,61}[a-z0-9])?(?:.[a-z0-9](?:[-0-9a-z]{0,61}[0-9a-z])?)*.?$"
    },
    "ADDITIONAL_HEADERS_TO_PROXY": {
      "type": "string",
      "default": "",
      "description": "comma separated list of additional headers to proxy"
    }
  },
  "additionalProperties": false
}

even though the user defined something different.

The expected result for your bug

The schema applied to the envs should be what the user defines. If adding the base envs is strictly necessary, they should be merged properly, based on the structure of the user-defined schema.
In my opinion, the best practice would be for the library to export the base schema and let users merge it by hand.
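
One way to combine the two schemas without flattening the user's root-level keywords would be JSON Schema's allOf (a sketch of the idea, not the library's current behavior; note that additionalProperties: false would have to be dropped from the base schema for this to validate correctly):

// Combine the library base schema and the user schema while preserving the
// user's root-level if/then/else, oneOf and anyOf keywords.
function mergeEnvSchemas(baseSchema, userSchema) {
  return {
    type: 'object',
    allOf: [baseSchema, userSchema],
  }
}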

Your environment

node: v14.19.3

custom-plugin-lib: v4.2.0

os: Ubuntu 20.04 LTS
