mia-platform / custom-plugin-lib
Library that allows you to define Mia-Platform custom plugins easily
Home Page: https://www.mia-platform.eu/
License: Apache License 2.0
Here the MICROSERVICE_GATEWAY_SERVICE_NAME env variable is set as required, but the corresponding service might not exist. The presence of this variable may lead to using the getServiceProxy function, whose URL points to the microservice-gateway (which is unreachable).
An example use case is a project that initially contains the microservice-gateway and later removes it: any service that still calls the microservice-gateway through the proxy fails with an obscure error.
My proposal is to make the MICROSERVICE_GATEWAY_SERVICE_NAME env variable not required.
If it is set (and not empty), getServiceProxy behaves as it does today (the microservice-gateway service is called).
If, on the other hand, the env variable is not set (or is empty) and the service calls the getServiceProxy method, the service will throw a clear error. This way it is possible to catch the issue, with an explanation of how to fix it.
If possible, it would be great if this error could be raised at service startup.
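A minimal sketch of the proposed behavior (the factory name and error message are illustrative assumptions, not the library's actual implementation): read the env variable once, and fail with an actionable message only when the gateway proxy is actually requested.

```javascript
// Hypothetical sketch: getServiceProxy throws a clear, actionable error
// when MICROSERVICE_GATEWAY_SERVICE_NAME is unset or empty.
function buildGetServiceProxy(env) {
  const gatewayName = env.MICROSERVICE_GATEWAY_SERVICE_NAME
  return function getServiceProxy(options = {}) {
    if (!gatewayName) {
      throw new Error(
        'MICROSERVICE_GATEWAY_SERVICE_NAME is not set: configure it, '
        + 'or use getDirectServiceProxy to reach the target service directly'
      )
    }
    // In the real library this would return the actual proxy object
    return { serviceName: gatewayName, ...options }
  }
}
```

Raising the same error at startup would only be possible if the library could know in advance whether any handler uses getServiceProxy, so the lazy check above may be the more practical option.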
The dependency @types/node was updated from 12.12.1 to 12.12.2. This version is covered by your current version range, and after updating it in your project the build failed.
@types/node is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.
There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.
Your Greenkeeper Bot 🌴
In the next major we may want to drop node 14 support
Passing both returnAs: 'BUFFER' and allowedStatusCodes: [...] in the options object to the delete function of a service throws an 'Unexpected end of JSON input' error when the response has a 204 HTTP status code ('No Content').
The JSON.parse call that causes the problem should be guarded by a conditional statement, so that it is skipped when there is no response body.
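A sketch of such a conditional (the helper name is hypothetical, not the library's actual code): parse the body as JSON only when there is actually content.

```javascript
// Hypothetical guard: skip JSON.parse for empty bodies (e.g. 204 No Content)
function parseResponseBody(statusCode, rawBody) {
  if (statusCode === 204 || !rawBody || rawBody.length === 0) {
    return undefined
  }
  return JSON.parse(rawBody.toString())
}
```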
To replicate the issue:
nock('http://my-service-name')
  .delete('/foo')
  .reply(404)

const service = serviceBuilder('my-service-name')
try {
  await service.delete(
    '/foo',
    undefined,
    undefined,
    { returnAs: 'BUFFER', allowedStatusCodes: [204] }
  )
} catch (error) {
  assert.equal(error.message, 'Invalid status code: 404. Allowed: 204.')
}
Running this test yields the 'Unexpected end of JSON input' error instead.
node: 15.10.0
custom-plugin-lib: 4.2.0
os: --
Hi,
using custom-plugin-lib with TypeScript I noticed that the DecoratedFastify type is missing the optional field customMetrics, which contains all the custom metrics defined for a service.
I would propose adding it, so that it is not necessary to extend the type in order to use custom metrics when they are enabled.
This could be achieved by editing the DecoratedFastify interface, adding the following field definition:
...
customMetrics?: Record<string, Metric<string>>
...
node: v16
npm: 8.5.0
custom-plugin-lib: v4.3.2
Build failing 🚨
Dependency: fastify
Current Version: 1.11.1
Type: dependency
This version is covered by your current version range and after updating it in your project the build failed.
fastify is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.
The new version differs by 13 commits.
4e047a8 Bumped v1.11.2
c40ea62 Add trivikr@ to the collaborators list (#1139)
0a27c92 Correct typos in Github Issue Template (#1140)
5b18645 Updated ecosystem doc (#1137)
0a874b9 Handle promises in the error handler with the same logic of normal handlers (#1134)
cce1a85 Rename ContentTypeParser (#1123)
6d302a5 Add test for error fixed in mcollina/avvio#74 (#1132)
60b85e7 Update Validation-and-Serialization.md (#1124)
d6982ea Remove/Merge redundant decorate functions (#1120)
baeebef Updated standard to v12. (#1121)
7c8401d Update ContentTypeParser.js (#1122)
a14397d ecosystem in alphabetical order
8a0c618 Update Ecosystem.md (#1125)
See the full diff
The dependency @types/node was updated from 12.7.0 to 12.7.1. This version is covered by your current version range, and after updating it in your project the build failed.
@types/node is a direct dependency of this project, and it is very likely causing it to break. If other packages depend on yours, this update is probably also breaking those in turn.
Since #84, serviceBuilder supports custom TLS options; it would be a good feature to be able to configure the library with a system-wide TLS configuration.
The use case we'd like to address is giving users the ability to configure their services with custom TLS certificates for all the serviceProxies that are created. We figured that providing environment variables with file paths (mounted as files, e.g. via k8s configmaps/secrets) would be a good solution: if these variables are provided, the library reads them at boot and supplies every serviceBuilder invocation with the proper TLS options.
Note: if a serviceBuilder provides specific TLS options, they should override the system-wide ones.
The + character is removed when parameters are parsed if the params are not URL encoded. E.g. using an "email" parameter whose value contains a + character (in a GET) returns an empty array, instead of returning the array with the corresponding user.
More precisely, the + character is substituted by fastify when parsing the query params, since the + char in a query parameter corresponds to the whitespace character (see here for further info).
The plugin should encode the URL, so that special characters are not lost before they reach fastify.
However, we're not 100% sure that the plugin should perform the encoding automatically. Please also note that this change might be breaking if someone is inserting the + char in the query parameter in order to insert a whitespace on purpose.
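A minimal illustration of the encoding in question (the email value is made up for the example): percent-encoding the parameter value turns + into %2B, which survives fastify's query-string parsing.

```javascript
// In an unencoded query string, '+' is decoded as a whitespace;
// encodeURIComponent turns it into '%2B' so the original value survives
const rawEmail = 'john+test@example.com' // hypothetical value
const encoded = encodeURIComponent(rawEmail)
// encoded === 'john%2Btest%40example.com'
```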
The proxy functions, exposed by the proxies obtained with the getServiceProxy and getDirectServiceProxy functions, should allow the user to pass the queryParameters inside the path of the request.
const myProxy = service.getDirectServiceProxy('localhost:3000')
const response = await myProxy.get('/myPath?name=Mike&surname=Shinoda') // currently throws an error
The code above should work correctly.
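One possible shape of the fix (a hypothetical helper, not the library's code): split the query string out of the path and treat it as query parameters.

```javascript
// Hypothetical helper: split '/myPath?name=Mike' into a pathname and a
// querystring object, so the proxy can forward both correctly
function splitPathAndQuery(path) {
  const [pathname, rawQuery = ''] = path.split('?')
  const querystring = Object.fromEntries(new URLSearchParams(rawQuery))
  return { pathname, querystring }
}
```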
node: 12
custom-plugin-lib: v1.1.1
The user should be able to use a config with any property value type.
The type CustomService<Config extends ServiceConfig = ServiceConfig> in the customPlugin namespace is too restrictive with the extends ServiceConfig constraint, since ServiceConfig is a NodeJS.Dict<string | number> by default. The type then breaks whenever you have a property in your config that is not a string or a number (e.g. a boolean).
The type breaks if you declare something like this:
type Env = { prop: boolean }
const customService = customPlugin<Env>(envJsonSchema)
I would expect the ServiceConfig type to be a NodeJS.Dict<unknown> rather than a NodeJS.Dict<string | number>.
node: v16.17.0
custom-plugin-lib: 5.1.3
os: Ubuntu
We created a new service using custom-plugin-lib version 5.1.5. From the changelog it's possible to see that there was a bug involving the environment variables that was fixed in 5.1.5 itself, but even with the latest version we encounter a problem at startup.
It seems that the environment variable file (default.env or local.env) is not read by the service.
The command I'm using is
set -a && source default.env && npx lc39 build/index.js
or
set -a && source default.env && npm start
where npm start runs lc39 build/index.js.
I was not able to find a solution in the changelog of custom-plugin-lib or lc39; do you know what the problem could be?
Changing the version to 4.3.2 solves the problem.
Describe the bug
When I set the isMiaHeaderInjected option to false, the headers to proxy set in the ADDITIONAL_HEADERS_TO_PROXY env are still proxied if I use the http client in the request object.
Furthermore, if I set the Mia headers inside the ADDITIONAL_HEADERS_TO_PROXY list, they are proxied anyhow.
Expected behavior
I expect headers not to be proxied when I set this option.
We have 2 possibilities: have isMiaHeaderInjected also control these additional headers, or do not forward the proxied headers when it is set to false.
The library is converting the null value into the "null" string in headers, e.g. for the CLIENTTYPE_HEADER_KEY. This results in forwarding that header with the "null" string as value even when the header is actually not provided/forwarded.
The library should not forward headers when they are set to null, or at least forward them with an empty string instead.
Note that while this bug has been discovered for the CLIENTTYPE_HEADER_KEY, it also impacts other headers.
A possible solution can be replacing:
function getClientType(clientType) {
  return clientType || null
}
with:
function getClientType(clientType) {
  return clientType || ''
}
The library should not forward a header if the header is set to null.
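An alternative sketch (the helper name is hypothetical): instead of stringifying null, drop null-valued headers before forwarding them.

```javascript
// Hypothetical helper: remove headers whose value is null or undefined,
// so they are simply not forwarded at all
function dropNullHeaders(headers) {
  return Object.fromEntries(
    Object.entries(headers).filter(([, value]) => value !== null && value !== undefined)
  )
}
```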
node: 12.19.0-alpine
custom-plugin-lib: 4.2.0
os: --
The docs do not discriminate between the getHttpClient function decorated onto the fastify instance and the one decorated onto the request instance, but there is a difference indeed.
Using the function decorated onto the fastify instance, the obtained client will not forward Mia headers, while using the same function decorated onto the request instance, the obtained client will actually forward those headers.
I would expect the client obtained from the fastify instance to still be able to forward Mia headers, or at least that this difference be pointed out in the docs.
custom-plugin-lib: 5.1.3
Issue on validating a multipart/form-data request
The schema definition of a multipart/form-data part requires the following body structure:
{
  additionalProperties: false,
  properties: {
    part: {
      description: 'Binary content of the file to upload',
      format: 'binary',
      type: 'string',
    },
  },
  required: ['part'],
  type: 'object',
}
Calling the endpoint with a binary file uploaded through the part field returns a validation error which states: `body.part must be string`.
I expect this validation error not to happen.
node: 8.15.0
custom-plugin-lib: 4.3.2
os: 20.04.5 LTS (Focal Fossa), Ubuntu
Here the default variables required by the cpl are set:
Lines 43 to 95 in b7dd0ea
For example, there is also the MICROSERVICE_GATEWAY_SERVICE_NAME env var, even if the microservice-gateway does not exist in the project.
From my point of view, it's fine that these are configurable, but this env should not be required. The MICROSERVICE_GATEWAY_SERVICE_NAME should be optional and enable/disable the pre/post decorator functions (raising an error otherwise).
Decorate the axios response with the duration of the response.
Due to the need to proxy various calls from one system to another, it could be useful to decorate the axios response to include the duration of the response.
Here you can find an example of this implementation for the request-manager.
It could be useful to use this value to log when the response doesn't respect a certain time limit, or to expose a metric to track the time of this kind of response.
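A minimal sketch of the idea, independent of the HTTP client (all names are hypothetical): measure the elapsed time around the call and attach it to the response.

```javascript
// Hypothetical wrapper: attach the elapsed milliseconds to the response
async function withDuration(doRequest) {
  const startTime = Date.now()
  const response = await doRequest()
  response.duration = Date.now() - startTime
  return response
}
```

With axios specifically, the same effect can be obtained with request/response interceptors that stamp a start time on the request config and compute the delta when the response arrives.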
node: v16.16.0
custom-plugin-lib: v5.1.1
os: Ubuntu 20.04.4 LTS
When a URL is passed to the getDirectServiceProxyFromUrlString function, the created Service object does not contain the path of the passed URL as prefix.
The code snippet below shows the way in which the Service object is built:
return serviceBuilder(
  completeUrl.hostname,
  requestMiaHeaders,
  {
    protocol: completeUrl.protocol,
    port: completeUrl.port,
    ...baseOptions,
  })
Example: if the URL is http://example.com/path, /path is not considered when building the Service object.
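A sketch of the expected behavior (the prefix option name is an assumption): keep the pathname of the parsed URL and pass it along so it is used as a path prefix.

```javascript
// The WHATWG URL class splits 'http://example.com/path' into components;
// the pathname ('/path') is what should be carried over as prefix
const completeUrl = new URL('http://example.com/path')
const serviceOptions = {
  protocol: completeUrl.protocol,
  port: completeUrl.port,
  prefix: completeUrl.pathname, // hypothetical option name
}
```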
The proxy obtained by getDirectServiceProxy cannot handle empty responses with status code 200, because it throws the following syntax error: SyntaxError: Unexpected end of JSON input.
This error is thrown because of the automatic parsing of the response as JSON; in fact, setting the returnAs option to BUFFER makes it possible to work around the problem.
Workaround example:
const proxy = request.getDirectServiceProxy(BASE_URL)
const response = await proxy.post( `myURL`, body, querystring, { returnAs: 'BUFFER' })
Extend DecoratedRequest TypeScript support to Querystring, Headers and Params too.
Right now the [DecoratedRequest interface](https://github.com/mia-platform/custom-plugin-lib/blob/9f20da345f208c391527c9c74e4bb7bcbc2955c2/index.d.ts#L60) allows specifying the Generic only for the inner FastifyRequest body, meaning that I can't specify the shape of any other request parameter (be it Querystring, Headers or Params).
interface HelloQueryParameters {
who?: string
}
interface HelloRequest {
Querystring?: HelloQueryParameters
}
service.addRawCustomPlugin('GET', '/hello', async function (request:DecoratedRequest<HelloRequest>, reply:FastifyReply<any>) {
return { message: `Hello ${request.getUserId() || (request.query as HelloQueryParameters).who}` }
}, schema)
In the snippet above, request.query is treated as unknown, so a cast is needed. The following should instead work directly:
service.addRawCustomPlugin('GET', '/hello', async function (request:DecoratedRequest<HelloRequest>, reply:FastifyReply<any>) {
return { message: `Hello ${request.getUserId() || request.query.who}` }
}, schema)
node: v12.17.0
custom-plugin-lib: @mia-platform/[email protected]
os: MacOS
It's quite common for developers to forget to implement some logs which are essential to debug a service's behavior and its integration with external systems. Some of these logs are always the same, and the developer should be able to take them for granted, so that she can focus on logs for her specific domain.
Here is the proposal: when registering an endpoint via addRawCustomPlugin, a set of default (or boilerplate) request/response logs should be emitted.
Sensitive information should be taken into account in order not to accidentally log it. A simple solution for this problem could consist of an opt-out boolean flag to be set in the JSON schema of the endpoint for all the fields that should be omitted from the logs (like all the personal information).
Here's an example of such a JSON schema:
{
  "headers": {
    "type": "object",
    "properties": {
      "authorization": {"type": "string", "hide": true}
    }
  },
  "body": {
    "type": "object",
    "properties": {
      "name": {"type": "string"},
      "surname": {"type": "string"},
      "taxCode": {"type": "string", "hide": true}
    }
  },
  "response": {
    "200": {
      "type": "object",
      "properties": {
        "medicalRecordID": {"type": "string", "hide": true}
      }
    }
  }
}
With these default/boilerplate logs, the developer should not remember to add noisy logs to her code, and she would know that it only takes to set the right LOG_LEVEL environment variable to have all the information she needs to debug a problem.
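A sketch of how such a hide flag could be honored when logging (the "hide" keyword is the one proposed above; the helper itself is hypothetical):

```javascript
// Hypothetical redaction step: replace every property whose schema sets
// "hide": true with a placeholder before the payload is logged
function redactBySchema(payload, schema) {
  const redacted = { ...payload }
  for (const [name, propertySchema] of Object.entries(schema.properties || {})) {
    if (propertySchema.hide === true && name in redacted) {
      redacted[name] = '***'
    }
  }
  return redacted
}
```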
getServiceProxy does not return on 204 response and returnAs: 'BUFFER'
If the service is using a version of node >= 13.0.0, getServiceProxy is used to call a DELETE on the microservice-gateway, and the response is a 204 No Content, the service gets stuck.
The problem seems to be in the simple-concat function called by simpleGet.concat in simple-get/index.js, which gets stuck waiting for data and end events that are never emitted.
simpleGet.concat = (opts, cb) => {
  return simpleGet(opts, (err, res) => {
    if (err) return cb(err)
    concat(res, (err, data) => {
      if (err) return cb(err)
      if (opts.json) {
        try {
          data = JSON.parse(data.toString())
        } catch (err) {
          return cb(err, res, data)
        }
      }
      cb(null, res, data)
    })
  })
}
simple-concat:
module.exports = function (stream, cb) {
  var chunks = []
  stream.on('data', function (chunk) {
    chunks.push(chunk)
  })
  stream.once('end', function () {
    if (cb) cb(null, Buffer.concat(chunks))
    cb = null
  })
  stream.once('error', function (err) {
    if (cb) cb(err)
    cb = null
  })
}
Theoretically it should be possible to reproduce the bug using this code snippet:
const query = {
  bucket: 'TestBucket1',
}
const customProxy = getServiceProxy(MICROSERVICE_GATEWAY_SERVICE_NAME, { port: 3001 })
nock('http://localhost:3001', { 'encodedQueryParams': true })
  .delete('/media-storage/media-id', {})
  .query({ 'bucket': 'TestBucket1' })
  .reply(204, '', [
    'date',
    'Wed, 21 Apr 2021 16:40:26 GMT',
    'content-type',
    'application/json; charset=utf-8',
    'content-length',
    '4',
    'Connection',
    'close',
  ])
let result
try {
  result = await customProxy.delete('/media-storage/media-id', {}, query, {
    headers: {
      miauserproperties: '{"permissions": ["MediaStorage.TestBucket1"]}',
    },
    returnAs: 'STREAM' })
} catch (error) {
  console.log('error', error.message)
}
In practice, though, I managed to reproduce the bug only by calling the actual microservice gateway.
The called endpoint should correctly return a response
node: ^13.0.0 || ^14.0.0
custom-plugin-lib: 2.3.0
os: tested on k8s
It's very common to need to unit test a module which relies on a service proxy received at runtime via getDirectServiceProxy, typically by dependency injection. In this case, the only way is to manually build a stub of the service proxy in order to make assertions about the calls to the proxied service.
It would be useful to have such a stub (builder?) available inside the lib and requirable by the unit test.
const {ServiceProxyStub} = require('@mia-platform/custom-plugin-lib')
const MyService = require('...')
const serviceStub = new ServiceProxyStub()
const myService = new MyService(serviceStub)
myService.doStuff()
assertCallsToService(serviceStub)
The alternative is to make the getDirectServiceProxy method available outside the context of the custom-plugin, so that the test can create an actual service proxy and then use libraries like nock to intercept the HTTP calls (it's still stubbing, just at the HTTP level).
const nock = require('nock')
const {getDirectServiceProxy} = require('@mia-platform/custom-plugin-lib')
const MyService = require('...')
const service = getDirectServiceProxy('crud-service', {...})
const myService = new MyService(service)
setupInterceptors(nock)
myService.doStuff()
assertInterceptors(nock)
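A minimal sketch of what the proposed ServiceProxyStub could look like (the class name comes from the snippet above; everything else is a hypothetical design, not an existing API): it records every call so the test can assert on them.

```javascript
// Hypothetical stub: records calls and returns a canned response
class ServiceProxyStub {
  constructor(cannedResponse = { statusCode: 200, payload: {} }) {
    this.calls = []
    this.cannedResponse = cannedResponse
  }
  async get(path, querystring = {}, options = {}) {
    this.calls.push({ method: 'GET', path, querystring, options })
    return this.cannedResponse
  }
  async post(path, body = {}, querystring = {}, options = {}) {
    this.calls.push({ method: 'POST', path, body, querystring, options })
    return this.cannedResponse
  }
}
```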
In service calls, we should automatically forward the platform headers (like x-request-id).
I suggest two possible ways to handle this:
All these solutions concatenate the platform headers with the headers set in the ADDITIONAL_HEADERS_TO_PROXY env var.
In the repo, tests/fixture/keys contains certificates (client.crt, server.crt, ca.crt) that will expire on 3 September 3061.
So far this is not a problem, but it could be useful to keep track of this issue.
A possible solution is to generate new certificates every time the tests are run, or to update these certificates before their expiry date.
Users should be able to use if-then-else, oneOf and anyOf at the root level of the json-schema.
Since here the library transforms the user-defined json-schema, any if-then-else, oneOf, anyOf etc. the user used is lost, and the json-schema applied to the environment variables ends up being different from what the user expected.
Pass the following json-schema to the library:
{
  type: 'object',
  required: [],
  if: {
    properties: {
      FLAG: { type: 'boolean', const: true }
    }
  },
  then: {
    required: ['BAR'],
    properties: {
      FLAG: { type: 'boolean', const: true },
      BAR: { type: 'string' }
    }
  },
  else: {
    required: ['FOO'],
    properties: {
      FLAG: { type: 'boolean', const: false },
      FOO: { type: 'string' }
    }
  }
}
The resulting json-schema applied to the envs will be this:
{
  "type": "object",
  "required": [
    "USERID_HEADER_KEY",
    "GROUPS_HEADER_KEY",
    "CLIENTTYPE_HEADER_KEY",
    "BACKOFFICE_HEADER_KEY",
    "MICROSERVICE_GATEWAY_SERVICE_NAME"
  ],
  "properties": {
    "USERID_HEADER_KEY": {
      "type": "string",
      "description": "the header key to get the user id",
      "minLength": 1
    },
    "USER_PROPERTIES_HEADER_KEY": {
      "type": "string",
      "description": "the header key to get the user permissions",
      "minLength": 1,
      "default": "miauserproperties"
    },
    "GROUPS_HEADER_KEY": {
      "type": "string",
      "description": "the header key to get the groups comma separated list",
      "minLength": 1
    },
    "CLIENTTYPE_HEADER_KEY": {
      "type": "string",
      "description": "the header key to get the client type",
      "minLength": 1
    },
    "BACKOFFICE_HEADER_KEY": {
      "type": "string",
      "description": "the header key to get if the request is from backoffice (any truly string is true!!!)",
      "minLength": 1
    },
    "MICROSERVICE_GATEWAY_SERVICE_NAME": {
      "type": "string",
      "description": "the service name of the microservice gateway",
      "pattern": "^(?=.{1,253}.?$)[a-z0-9](?:[a-z0-9-]{0,61}[a-z0-9])?(?:.[a-z0-9](?:[-0-9a-z]{0,61}[0-9a-z])?)*.?$"
    },
    "ADDITIONAL_HEADERS_TO_PROXY": {
      "type": "string",
      "default": "",
      "description": "comma separated list of additional headers to proxy"
    }
  },
  "additionalProperties": false
}
even though the user defined something different.
The schema applied to the envs should be what the user defines. If adding the base envs is strictly necessary, then they should be merged properly based on the structure of the user-defined schema.
The best practice, in my opinion, would be for the library to export the base schema and let users handle the merge by hand.
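One possible merge strategy (an assumption on my side, not the library's current behavior): combine the base env schema and the user schema with allOf, which validates both while preserving root-level conditional keywords untouched.

```javascript
// Hypothetical merge: both schemas must hold, and the user's root-level
// if/then/else, oneOf, anyOf keywords survive untouched
function mergeEnvSchemas(baseSchema, userSchema) {
  return { allOf: [baseSchema, userSchema] }
}
```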
node: v14.19.3
custom-plugin-lib: v4.2.0
os: Ubuntu 20.04 LTS