Serverless adapters for the universal runtime.
$ npm install @adobe/helix-universal
$ npm install
$ npm test
$ npm run lint
The base API for Universal Serverless Functions
License: Apache License 2.0
In:
helix-universal/src/aws-adapter.js
Lines 110 to 121 in 85e54e6
we could additionally check whether e is of type SyntaxError, which would indicate that JSON parsing of the request body failed in body-data-wrapper.js (see adobe/helix-shared#593).
Potential downside: some genuine parsing problem elsewhere in the code might be treated as a bad request.
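A minimal sketch of the proposed check (the `invokeGuarded` wrapper and the response shape are illustrative, not the adapter's actual code):

```javascript
// Sketch only: treat a SyntaxError from JSON body parsing as a client error.
// `handle` stands in for the wrapped function invocation.
async function invokeGuarded(handle) {
  try {
    return await handle();
  } catch (e) {
    if (e instanceof SyntaxError) {
      // JSON.parse of the request body failed in body-data-wrapper.js
      return { statusCode: 400, body: `Bad Request: ${e.message}` };
    }
    return { statusCode: 500, body: 'Internal Server Error' };
  }
}
```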
For example, ContentType is included in the signature computation:
<?xml version="1.0" encoding="UTF-8"?>
<Error><Code>SignatureDoesNotMatch</Code><Message>The request signature we calculated does not match the signature you provided. Check your key and signing method.</Message><AWSAccessKeyId>AKIARXE2QZFCSXL7IA4R</AWSAccessKeyId><StringToSign>PUT
application/json
1622618896
/h3d4b1d4ea6d84b0229bce7cf6806b0bb3470489ab8205a13f75cfe518fa7/live/data-embed-unit-tests/data-embed-no-helix.json</StringToSign><SignatureProvided>01F7yGPBahNDioaR5A/AIfWn6Q0=<
I suggest including blob params like this:
class AWSStorage extends Storage {
  static async presignURL(bucket, path, blobParams = {}, method = 'GET', expires = 60) {
    if (!AWS) {
      // eslint-disable-next-line global-require, import/no-extraneous-dependencies
      AWS = require('aws-sdk');
      AWS.config.update({
        region: process.env.AWS_REGION || 'us-east-1',
        logger: console,
      });
    }
    const s3 = new AWS.S3();
    const operation = method === 'PUT' ? 'putObject' : 'getObject';
    const params = {
      Bucket: bucket,
      Key: path.startsWith('/') ? path.substring(1) : path,
      Expires: expires,
      ...blobParams,
    };
    return s3.getSignedUrl(operation, params);
  }
}
In AWS, one can create a trigger that invokes a Lambda function whenever a queue in Amazon SQS receives input. Technically, the function is invoked with a Records array-type property in its event argument, containing the messages available in the SQS queue.
In order to support this kind of invocation for a function that can also be called manually with parameters, it would be ideal if the AWS adapter inspected that argument and filled the provided information into the request body.
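A sketch of what that inspection could look like (the detection heuristic and the resulting body shape are assumptions for illustration):

```javascript
// Sketch: detect an SQS trigger by its Records array and expose the message
// bodies as a JSON request body the universal function can read.
function extractSQSBody(event) {
  if (Array.isArray(event.Records) && event.Records[0]?.eventSource === 'aws:sqs') {
    return JSON.stringify({ records: event.Records.map((r) => r.body) });
  }
  // not an SQS invocation: leave the request body alone
  return null;
}
```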
We have a couple of actions now that need access to S3 or equivalent storage locations. Instead of creating a full-blown storage abstraction, what about starting with a minimal set of:
context.storage.presignURL(bucketname: string, path: string, method?: string = 'GET', expires?: number = 60): generates a pre-signed URL for the specified bucket and path. This URL can then be used in a helix-fetch request to upload to or download from storage.
For Adobe I/O Runtime we may be able to use https://github.com/adobe/aio-lib-files which has support for pre-signed URLs, but that requires the presence of @adobe/aio-lib-files in the container image.
For all others, there are libraries that can even take the credentials from the container, so we should be covered.
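A hypothetical action using the proposed API could look like this (bucket, path, and the fetch call are illustrative, since context.storage does not exist yet):

```javascript
// Sketch: upload data through a pre-signed URL so the action itself needs
// no storage credentials. context.storage is the proposed API, not existing code.
async function storeResult(context, data) {
  const url = await context.storage.presignURL('my-bucket', '/results/data.json', 'PUT', 60);
  return fetch(url, { method: 'PUT', body: JSON.stringify(data) });
}
```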
the binary check only happens on the content-type:
const isBase64Encoded = isBinary(response.headers.get('content-type'));
but it should also respect the content-encoding.
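A sketch of a combined check (the injected `isBinary` helper and the treatment of `identity` are assumptions):

```javascript
// Sketch: a compressed body (gzip, br, deflate, ...) is not valid text even
// when the content-type looks textual, so it must be base64-encoded too.
// `isBinary` is the existing content-type check, injected here for illustration.
function needsBase64(headers, isBinary) {
  const encoding = headers.get('content-encoding');
  if (encoding && encoding !== 'identity') {
    return true;
  }
  return isBinary(headers.get('content-type'));
}
```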
In Working with parameters it is shown that one can invoke an OpenWhisk action and pass JSON as parameter directly:
wsk action invoke --result hello -p person '{"name": "Dorothy", "place": "Kansas"}'
Passing this to a universal action ends up with a wrong parameter value:
person: [object Object]
The reason is:
helix-universal/src/openwhisk-adapter.js
Lines 65 to 71 in 20e63a4
where parameters are added as search parameters to the request URL regardless of their type.
A concrete example where this is used is in helix-index-files, where an observation message is passed via trigger as a JSON object to the action.
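One way the adapter could avoid the coercion is to JSON-serialize object-valued parameters before appending them (a sketch; whether and where the value is parsed back is a separate question):

```javascript
// Sketch: stringify object-valued parameters instead of letting
// URLSearchParams coerce them to "[object Object]".
function appendParams(url, params) {
  for (const [name, value] of Object.entries(params)) {
    const serialized = (typeof value === 'object' && value !== null)
      ? JSON.stringify(value)
      : String(value);
    url.searchParams.append(name, serialized);
  }
  return url;
}
```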
Right now using universal in TypeScript means you need to manually set main()'s parameter types. This is expected, not much we can do about it, but the pain point is when you're using wrappers that extend the context.
This works great:
import { UniversalContext, Request } from '@adobe/helix-universal';
export function main(req: Request, ctx: UniversalContext) {
  ...
}
This doesn't:
import { UniversalContext, Request } from '@adobe/helix-universal';
import wrap from '@adobe/helix-shared-wrap';
import { logger } from '@adobe/helix-universal-logger';
function _main(req: Request, ctx: UniversalContext) {
  ctx.log.info("woo"); // Type error, ctx.log doesn't exist
}
export const main = wrap(_main).with(logger);
Ignoring the last line, which has its own type errors, the context doesn't "know" about the logger wrapper. A workaround could be to import each context and make a custom interface merging them, but that would be repeated every time it's used.
The context APIs seem pretty useful to expose, so I'm proposing a namespace that would be exposed, allowing wrapper functions to extend the context as they need.
Assuming @adobe/helix-universal-logger adds an extension to the namespace like this:
// index.d.ts
declare module '@adobe/helix-universal' {
  namespace HelixUniversal {
    export interface UniversalContext {
      log: {
        info: (...msgs: any[]) => void;
        // ...
      };
    }
  }
}
TS clients could then use:
import { HelixUniversal, Request } from '@adobe/helix-universal';
import wrap from '@adobe/helix-shared-wrap';
import { logger } from '@adobe/helix-universal-logger';
async function _main(req: Request, ctx: HelixUniversal.UniversalContext) {
  ctx.log.info("woo"); // works
}
export const main = wrap(_main).with(logger);
wdyt @tripodsan @trieloff ?
Is your feature request related to a problem? Please describe.
The way that helix-universal detects that the response should be returned as-is is based on the absence of event.requestContext (see
helix-universal/src/aws-adapter.js
Line 87 in 5eb09fc
).
When used as an Authorizer, event.requestContext is set. But the response needs to be returned in the raw form provided by helix-universal when nonHttp is true.
Describe the solution you'd like
Allow functions to explicitly ask for non-HTTP mode to be used in the response by setting a header, e.g. force-non-http: true.
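A function opting in could then look like this (the authorizer policy and the header name follow the proposal; Response is the standard Fetch API class):

```javascript
// Sketch: an authorizer returning its policy with the proposed opt-in header,
// asking the adapter to pass the body through in raw (non-HTTP) form.
async function main(request, context) {
  const policy = {
    principalId: 'user',
    policyDocument: { Version: '2012-10-17', Statement: [] },
  };
  return new Response(JSON.stringify(policy), {
    headers: { 'content-type': 'application/json', 'force-non-http': 'true' },
  });
}
```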
Describe alternatives you've considered
Not using helix-universal for this use case but that would create other problems.
Description
with adobe/fetch#309 fixed, multivalue headers like set-cookie are now arrays and need to be handled differently in AWS responses.
see: https://docs.aws.amazon.com/lambda/latest/dg/services-apigateway.html#apigateway-types-transforms
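A sketch of the split the adapter needs for HTTP API (payload v2) responses, where set-cookie values go into a separate `cookies` field (names here are illustrative):

```javascript
// Sketch: move array-valued set-cookie headers into the `cookies` field that
// API Gateway's HTTP API (payload v2) expects; keep everything else as-is.
function splitHeaders(headers) {
  const single = {};
  const cookies = [];
  for (const [name, value] of Object.entries(headers)) {
    if (name.toLowerCase() === 'set-cookie') {
      cookies.push(...[].concat(value));
    } else {
      single[name] = value;
    }
  }
  return { headers: single, cookies };
}
```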
instead of reading from the parameter store using getParametersByPath(), which has a very low TPS limit (100), all the parameters should be bundled up (JSON) and fetched from the secrets manager, which has a 5000 TPS limit.
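A sketch of the bundled read (the secret name is illustrative, and the client follows the aws-sdk v2 shape; it is injected here so it can be stubbed):

```javascript
// Sketch: one Secrets Manager entry holds all parameters as a JSON object,
// fetched with a single getSecretValue call instead of paging getParametersByPath.
async function loadParams(secretsManager, secretId) {
  const { SecretString } = await secretsManager
    .getSecretValue({ SecretId: secretId })
    .promise();
  return JSON.parse(SecretString);
}
```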
helix-universal/src/aws-adapter.js
Line 161 in 15ba0ac
This can fail when the response is not created by @adobe/fetch; instead, fall back to built-in functions like https://developer.mozilla.org/en-US/docs/Web/API/Headers/entries
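A sketch of such a fallback (the `plain()` shortcut is, to my knowledge, how @adobe/fetch exposes headers as an object; treat it as an assumption):

```javascript
// Sketch: prefer the @adobe/fetch convenience method when present, otherwise
// use the spec-standard Headers iteration that any implementation supports.
function headersToObject(headers) {
  if (typeof headers.plain === 'function') {
    return headers.plain();
  }
  return Object.fromEntries(headers.entries());
}
```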
When invoking the latest version of an AWS function, the functionAlias in the ARN is undefined:
helix-universal/src/aws-adapter.js
Lines 44 to 52 in 6a4cd51
Further down, a replace on that functionAlias is attempted:
helix-universal/src/aws-adapter.js
Lines 65 to 71 in 6a4cd51
which causes a TypeError.
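A sketch of a guard (the '$LATEST' fallback and the underscore-to-dot rewrite are assumptions based on this context, not the adapter's exact code):

```javascript
// Sketch: tolerate a missing alias when the $LATEST version is invoked,
// instead of calling .replace() on undefined.
function resolveAlias(functionAlias) {
  if (!functionAlias) {
    return '$LATEST';
  }
  // the adapter rewrites version separators, e.g. "4_3_1" -> "4.3.1"
  return functionAlias.replace(/_/g, '.');
}
```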
it seems that the cookie header is lost / not set in the request.
it should be populated using the event.cookies array:
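A sketch of the reassembly (in HTTP API payload v2, API Gateway strips cookies out of event.headers and delivers them as an array):

```javascript
// Sketch: rebuild the cookie request header from the event.cookies array
// before constructing the universal Request.
function addCookieHeader(headers, event) {
  if (Array.isArray(event.cookies) && event.cookies.length > 0) {
    headers.cookie = event.cookies.join('; ');
  }
  return headers;
}
```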
When preparing the request:
helix-universal/src/aws-adapter.js
Lines 100 to 104 in a541ba4
the code should check whether the HTTP method supports non-empty body at all, and reject it with a 400 otherwise - currently, it reports a 500.
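A sketch of that check (the method list and response shape are illustrative):

```javascript
// Sketch: reject a body on methods that must not carry one with a 400,
// instead of letting the failure surface later as a 500.
function validateBody(method, body) {
  if (body && ['GET', 'HEAD'].includes(method.toUpperCase())) {
    return { statusCode: 400, body: `${method} request must not have a body` };
  }
  return null; // request is acceptable
}
```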
the process.env variables are only set before invoking the function, which is a problem because the main module is imported before that, so any global checks that depend on process.env will fail.
e.g.:
// use HTTP1 if we run serverless. otherwise the open http/2 connections might hang the process.
process.env.HELIX_FETCH_FORCE_HTTP1 = process.env.HELIX_UNIVERSAL_RUNTIME;
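One way to address this is a lazy, cached import at invocation time (a sketch; the './main.js' path and the injectable loader are illustrative):

```javascript
// Sketch: defer loading the function module until the adapter has populated
// process.env, so module-level env checks see the final values.
let cachedMain;
async function getMain(load = () => import('./main.js')) {
  if (!cachedMain) {
    cachedMain = (await load()).main;
  }
  return cachedMain;
}
```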
Suggestion
Import main after the process.env is set.
Description
A rate limit error while fetching the parameters should either:
2021-04-14T09:01:36.298Z 7a3a96f0-350b-435a-8f04-5923ebfe4972 ERROR unable to get parameters ThrottlingException: Rate exceeded
at Request.extractError (/var/runtime/node_modules/aws-sdk/lib/protocol/json.js:52:27)
at Request.callListeners (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:106:20)
at Request.emit (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:78:10)
at Request.emit (/var/runtime/node_modules/aws-sdk/lib/request.js:688:14)
at Request.transition (/var/runtime/node_modules/aws-sdk/lib/request.js:22:10)
at AcceptorStateMachine.runTo (/var/runtime/node_modules/aws-sdk/lib/state_machine.js:14:12)
at /var/runtime/node_modules/aws-sdk/lib/state_machine.js:26:10
at Request.<anonymous> (/var/runtime/node_modules/aws-sdk/lib/request.js:38:9)
at Request.<anonymous> (/var/runtime/node_modules/aws-sdk/lib/request.js:690:12)
at Request.callListeners (/var/runtime/node_modules/aws-sdk/lib/sequential_executor.js:116:18) {
code: 'ThrottlingException',
time: 2021-04-14T09:01:36.297Z,
requestId: '0014473a-2458-445e-a5f7-3ddca2dff603',
statusCode: 400,
retryable: true
}
With the wrapper and universal gateway in place, we can now support large responses in the following way: the function stores the response body and responds with a Location header pointing to the stored response body; the gateway RESTARTs and delivers the body from the Location.
The response cleanup could also be done by the wrapper in the next request.
This issue lists Renovate updates and detected dependencies. Read the Dependency Dashboard docs to learn more.
These updates are awaiting their schedule.
(@google-cloud/storage, aws-sdk)
.github/workflows/main.yaml
actions/checkout v4
actions/setup-node v4
codecov/codecov-action v4
actions/checkout v4
actions/setup-node v4
.github/workflows/semver-check.yaml
package.json
@adobe/fetch 4.1.2
aws4 1.12.0
@adobe/eslint-config-helix 2.0.6
@google-cloud/secret-manager 5.5.0
@google-cloud/storage 7.10.2
@semantic-release/changelog 6.0.3
@semantic-release/git 10.0.1
aws-sdk 2.1613.0
c8 9.1.0
eslint 8.57.0
esmock 2.6.5
husky 9.0.11
junit-report-builder 3.2.1
lint-staged 15.2.2
mocha 10.4.0
mocha-multi-reporters ^1.5.1
nock 13.5.4
semantic-release 23.0.8
Description
in order to reduce dependencies, it would be better to support some kind of pluggable mechanism to install handlers.
e.g., using a custom adapter:
import { adapter, awsSecrets } from '@adobe/helix-universal';
import { awsEpsagon } from '@adobe/helix-epsagon';

export const lambda = adapter.aws.with(awsEpsagon).with(awsSecrets);
export const main = adapter.openwhisk.raw;
export const google = adapter.google.raw;
having the process.env variables mixed into the context.env object is kind of redundant and might be confusing.
context.env should only contain the information that is specific to the deployed environment. an action can still use process.env explicitly.
When I try invoking my hedy-deployed function in the AWS console, I get a TypeError indicating that some information is missing to build the context passed to the universal main, e.g. the event.requestContext:
helix-universal/src/aws-adapter.js
Line 31 in 3318583
I understand that it is not possible to create a resolver without a requestContext, but in order to easily build an AWS trigger for a function that does not require the resolver, it would be ideal if the adapter tolerated the missing information, and using the resolver would only fail in that particular situation.
On the other hand, I can also try to create an AWS trigger and feed the expected JSON into the event payload.
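A sketch of what that lazy failure could look like (createURL mirrors the resolver's role here, but the method name and URL shape are illustrative):

```javascript
// Sketch: build the resolver even without event.requestContext, and fail
// only when it is actually used.
function createResolver(requestContext) {
  return {
    createURL(name) {
      if (!requestContext) {
        throw new Error(`cannot resolve '${name}': event.requestContext is missing`);
      }
      // build the URL from the request context (simplified)
      return `https://${requestContext.domainName}/${name}`;
    },
  };
}
```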