
external-adapters-js's Issues

Blockcypher request rate limiting

Description

The Blockcypher balance adapter is blocked: even on Blockcypher's most premium plan, the per-second rate limit cannot accommodate wbtc's ~30 custodial wallet requests.

One solution could be to support a new environment variable, e.g. RATE_LIMIT=10, where the given number is the maximum number of requests sent per second.

This would unlock the wbtc proof-of-reserves adapter and allow node operators to use a cheaper plan.


The above has been given a naive solution, but a more reusable function could be added to utils.
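A reusable utils function along these lines might look like the sketch below. `makeRateLimiter` is a hypothetical name, not an existing export; it simply spaces calls at least `1000 / requestsPerSecond` ms apart:

```typescript
// Sketch only: a reusable per-second rate limiter that could live in utils.
// Name and API are assumptions, not part of the repo.
export const makeRateLimiter = (requestsPerSecond: number) => {
  const intervalMs = 1000 / requestsPerSecond
  let next = 0 // earliest timestamp the next call may fire
  return async <T>(fn: () => Promise<T>): Promise<T> => {
    const now = Date.now()
    const wait = Math.max(0, next - now)
    next = Math.max(now, next) + intervalMs // reserve the next slot
    if (wait > 0) await new Promise((r) => setTimeout(r, wait))
    return fn()
  }
}
```

An adapter could then wrap each outgoing request, e.g. `const limited = makeRateLimiter(Number(process.env.RATE_LIMIT ?? 10))` and `await limited(() => Requester.request(config))`.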

References

#276

Validation error in "unknown endpoint" of behaviors

In test-helpers' balance behavior, an unknown endpoint should not result in a validation error, since an endpoint that is invalid for the balance behavior might be valid in another adapter's behavior, as happens with cryptoapis.

Non-price feed adapters throw an error when queried due to metrics checking for price feed parameters

A non-blocking error is thrown each time a non-price feed adapter is queried.

{"error":{"jobRunID":"1","status":"errored","statusCode":400,"name":"AdapterError"},"level":"error","message":"Error validating input.","instanceId":"bd50ff51-dcc2-459b-924c-41a7e8f8783a","timestamp":"2021-04-07T14:20:40.029Z"}
{"message":"Unable to validate feed name","level":"debug","instanceId":"bd50ff51-dcc2-459b-924c-41a7e8f8783a","timestamp":"2021-04-07T14:20:40.029Z"}

The parameters are validated as part of the middleware/metrics system in /bootstrap/src/lib/metrics/util.ts, which runs the Validator that throws the first line of the error above.

/**
 * Get feed id name based on input params
 * @param input The adapter input request
 * @returns {string}
 */
export const getFeedId = (input: AdapterRequest): string => {
  const commonFeedParams = {
    base: ['base', 'from', 'coin', 'symbol', 'asset'],
    quote: ['quote', 'to', 'convert'],
  }
  try {
    const validator = new Validator(input, commonFeedParams)
    if (validator.error) {
      logger.debug('Unable to validate feed name')
      return JSON.stringify(input)
    }
    return `${validator.validated.data.base.toUpperCase()}/${validator.validated.data.quote.toUpperCase()}`
  } catch (e) {
    return JSON.stringify(input)
  }
}

This can be confusing to operators, who see an error when running non-price feed EAs.
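One possible fix, sketched below under assumed types and names (`looksLikePriceFeed` is hypothetical), is to cheaply check whether the request carries price-feed parameters at all before running the Validator, so non-price adapters never trigger its error logging:

```typescript
// Hypothetical guard for getFeedId: detect price-feed-shaped requests up front.
type AdapterRequest = { id: string; data: Record<string, unknown> }

const BASE_KEYS = ['base', 'from', 'coin', 'symbol', 'asset']
const QUOTE_KEYS = ['quote', 'to', 'convert']

export const looksLikePriceFeed = (input: AdapterRequest): boolean => {
  const data = input.data ?? {}
  // Only run the full Validator when both a base-like and a quote-like key exist.
  return BASE_KEYS.some((k) => k in data) && QUOTE_KEYS.some((k) => k in data)
}
```

getFeedId could then fall back to `JSON.stringify(input)` immediately when this returns false, without any Validator error being emitted.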

Adapters should automatically export their name

The discussion about this started here.

This was first used as an ENV prefix to avoid naming collisions when using two or more adapters in the same environment, so for now it was a simple hardcoded uppercase string.

What we should do now to make this more robust is this:

  • Import the name from package.json, remove the -adapter suffix, and export that as NAME.
  • Create a toEnvName util function that will map it to a safe string to use as ENV prefix to avoid naming collisions.

Environment variable names used by the utilities in the Shell and Utilities volume of IEEE Std 1003.1-2001 consist solely of uppercase letters, digits, and the '_' (underscore) from the characters defined in Portable Character Set and do not begin with a digit.
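A sketch of the proposed toEnvName util (the name comes from this issue; the implementation is an assumption) that maps an adapter name to a POSIX-safe prefix of uppercase letters, digits and '_', never starting with a digit:

```typescript
// Hypothetical implementation of the toEnvName util proposed above.
export const toEnvName = (name: string): string => {
  // Replace anything outside [A-Z0-9_] with '_' after uppercasing.
  const prefix = name.toUpperCase().replace(/[^A-Z0-9_]/g, '_')
  // POSIX env names must not begin with a digit.
  return /^\d/.test(prefix) ? `_${prefix}` : prefix
}
```

For example, `toEnvName('dydx-stark')` yields `'DYDX_STARK'` and `toEnvName('1forge')` yields `'_1FORGE'`.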

Make e2e integration tests more usable by injecting secrets

Integration tests, as they are currently set up, are not very usable. We would like to start integration tests for all adapters at once, but how do we then inject a different API key into each one?

One possible solution would be to have a secrets.json configured with all secrets required for integration tests. Then every test can read from there.

For CI runs we could have an encrypted secrets.json.gpg file that gets decrypted and provided to the test suite. Documentation on this workflow can be found here.
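A minimal sketch of reading from such a file (the file name and shape are this issue's proposal, not an existing convention in the repo):

```typescript
// Sketch: each integration test pulls its key from a shared secrets.json.
import * as fs from 'fs'

type Secrets = Record<string, { apiKey: string }>

export const loadSecret = (adapter: string, file = 'secrets.json'): string | undefined => {
  const secrets: Secrets = JSON.parse(fs.readFileSync(file, 'utf8'))
  return secrets[adapter]?.apiKey
}
```

In CI, the encrypted secrets.json.gpg would be decrypted into secrets.json before the test suite runs.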

Originally posted by @krebernisak in #218 (comment)

Make build failing

After installing using

yarn

at the top level, I ran make build coingecko and am getting

make build coingecko
yarn ncc build  -o /dist
yarn run v1.22.4
$ /Users/kjd/go/src/github.com/smartcontractkit/external-adapters-js/node_modules/.bin/ncc build -o /dist
{ Error: Cannot find module '/Users/kjd/go/src/github.com/smartcontractkit/external-adapters-js'
    at Function.Module._resolveFilename (internal/modules/cjs/loader.js:636:15)
    at resolve (internal/modules/cjs/helpers.js:33:19)
    at runCmd (evalmachine.<anonymous>:1:47259)
    at Object.819 (evalmachine.<anonymous>:1:44241)
    at __webpack_require__ (evalmachine.<anonymous>:1:216)
    at startup (evalmachine.<anonymous>:1:353)
    at module.exports.8 (evalmachine.<anonymous>:1:385)
    at evalmachine.<anonymous>:1:395
    at Object.<anonymous> (/Users/kjd/go/src/github.com/smartcontractkit/external-adapters-js/node_modules/@zeit/ncc/dist/ncc/cli.js:6:28)
    at Module._compile (internal/modules/cjs/loader.js:778:30) code: 'MODULE_NOT_FOUND' }
error Command failed with exit code 1.
info Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.
make: *** [build] Error 1

Seems to be happening for all the adapters I tried.

Improve adapter input validation and documentation

This idea started here by @yosriady.

As validation is also a big part of external-adapters-js, which has a simple Validator we currently use, it would be great if we could combine our efforts here and define a simple validation utility that we would use for both.

I like the idea of defining a declarative validation strategy, one that we could also use to add extra documentation to external adapters. Either printing it to logs or adding a special endpoint that would return the input schema for documentation and debug purposes. But just reading it from source code as-is would be beneficial for someone trying to deploy and use those adapters. Better and more specific error messages would also bring a much improved UX.

If we make it easy to write custom validation functions, more of them will be written, which would further improve code safety.
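Illustrative only: one possible declarative schema shape such a shared Validator could consume, which doubles as documentation (all names and fields here are assumptions, not the current API):

```typescript
// A declarative input schema that can be both validated against and printed as docs.
type ParamSpec = { aliases: string[]; required: boolean; description: string }

export const inputSchema: Record<string, ParamSpec> = {
  base: { aliases: ['from', 'coin'], required: true, description: 'Symbol of the asset to query' },
  quote: { aliases: ['to', 'convert'], required: true, description: 'Symbol to convert to' },
}

// Render the schema for logs or a documentation/debug endpoint.
export const describeParams = (schema: Record<string, ParamSpec>): string =>
  Object.entries(schema)
    .map(([name, s]) => `${name} (aka ${s.aliases.join(', ')})${s.required ? ' [required]' : ''}: ${s.description}`)
    .join('\n')
```

Because the schema is data rather than code, the same object can drive validation, error messages, and generated documentation.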

Improve EA error handling

My thoughts are that we should handle most, if not all, errors and output them with correct status codes and appropriate messages. We have work to do here.

Internal errors should be caught, mapped to an HTTP 500 status code, and given an appropriate message, something like Internal Server Error: ${error.name}. Then, if we are in a dev environment (NODE_ENV=development) or some other verbose/debug flag is set, we dump the whole internal error to the output as an underlying cause to be investigated.
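A minimal sketch of that mapping (the response shape is assumed, not the repo's actual API):

```typescript
// Map a caught internal error to an HTTP 500 response body.
export const toErrorResponse = (error: Error) => ({
  statusCode: 500,
  message: `Internal Server Error: ${error.name}`,
  // Only expose the underlying cause in verbose/dev environments.
  ...(process.env.NODE_ENV === 'development' ? { cause: error.stack } : {}),
})
```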

Originally posted by @krebernisak in #212 (comment)

401 coinapi won't work

Test:

git clone https://github.com/smartcontractkit/external-adapters-js
cd external-adapters-js/coinapi
npm i
API_KEY=xyz npm test

Result:

{"message":"Could not reach endpoint: \"Request failed with status code 401\"","level":"error","instanceId":"1bc8f1c1-592c-42af-b217-ac3ff7c1d0c9","timestamp":"2020-10-27T05:37:12.259Z"}

Migrate @chainlink/ea-bootstrap to 100% async

Right now there is a divide in the codebase between an older style of adapters that use synchronous callbacks and a newer style that awaits promises.

This can be seen here.

All code should be migrated away from using ExecuteWrappedResponse.

Improve CI jobs runtime

Currently, CI jobs for a PR branch could run in excess of 45 min. This is too much waiting time and slows down the normal development process.

Some ideas on how to speed up CI runs:

  • build all adapters with one command (a lot of time is spent waiting for CI runners to boot up)

Support batching for PoR adapters where available

  • Blockcypher

Unfortunately, this is only useful to improve latency:

With regard to rate limits, each individual batch call counts as a request; for example, if you request 3 addresses in a batch, you're still using 3 API calls of resources on our end. The big advantage to batching is that you avoid 3 separate round-trip calls/reduce latency. Since the default, non-registered rate limit per second is 3, larger batches require a paid API token. To use larger batches please register.

Coingecko EA not working properly with the cache

Typically, when a job hits the api directly, the response looks like this:

{"jobRunID":"1","result":0.02570336,"statusCode":200,"data":{"result":0.02570336}}

And when it hits the cache, a successful response looks something like this:

{"result":0.025649,"maxAge":60000,"statusCode":200,"data":{"status":200,"title":"OK","description":"Successful request","payload":{"timestamp":1615461840000,"pair":"bal_eth","price":"0.025649","volume":"0.95"},"result":0.025649}}

For coingecko, we get this:

{"maxAge":60000,"statusCode":200,"data":{"result":0.025649}}

The jobspecs are looking for a result field at the top level. Can we please add it so that all caches send the result to the top level?
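One way the cache layer could guarantee this is a small normalization step applied before responding (the response shape below is simplified for illustration):

```typescript
// Lift data.result to the top level when it is missing there.
type CachedResponse = { result?: number; maxAge?: number; statusCode: number; data: { result?: number } }

export const withTopLevelResult = (response: CachedResponse): CachedResponse => ({
  ...response,
  result: response.result ?? response.data?.result,
})
```

Applied to the coingecko example above, `{"maxAge":60000,"statusCode":200,"data":{"result":0.025649}}` would gain a top-level `result` field matching `data.result`.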

Unable to publish NPM packages using Github Actions

I've been following GitHub Docs | Publishing Node.js packages and various open issues on actions/setup-node, but I can't seem to get it to work for the life of me.

The most common error for NPM publish:

Run yarn workspace @chainlink/external-adapter publish --non-interactive --access public
yarn workspace v1.22.5
yarn publish v1.22.5
[1/4] Bumping version...
info Current version: 0.2.4
[2/4] Logging in...
[3/4] Publishing...
error Couldn't publish package: "https://registry.npmjs.org/@chainlink%2fexternal-adapter: Not found"
info Visit https://yarnpkg.com/en/docs/cli/publish for documentation about this command.
error Command failed.
Exit code: 1
Command: /opt/hostedtoolcache/node/12.18.3/x64/bin/node
Arguments: /usr/share/yarn/lib/cli.js publish --non-interactive --access public
Directory: /home/runner/work/external-adapters-js/external-adapters-js/external-adapter
Output:

info Visit https://yarnpkg.com/en/docs/cli/workspace for documentation about this command.
##[error]Process completed with exit code 1.

The most common error for GitHub Packages publish:

[1/4] Bumping version...
info Current version: 0.2.14
[2/4] Logging in...
verbose 0.295776152 Error: No token found and can't prompt for login when running with --non-interactive.
    at /home/runner/work/external-adapters-js/external-adapters-js/.yarn/releases/yarn-1.22.5.cjs:35203:13
    at Generator.next (<anonymous>)
    at step (/home/runner/work/external-adapters-js/external-adapters-js/.yarn/releases/yarn-1.22.5.cjs:310:30)
    at /home/runner/work/external-adapters-js/external-adapters-js/.yarn/releases/yarn-1.22.5.cjs:328:14
    at new Promise (<anonymous>)
    at new F (/home/runner/work/external-adapters-js/external-adapters-js/.yarn/releases/yarn-1.22.5.cjs:25783:28)
    at /home/runner/work/external-adapters-js/external-adapters-js/.yarn/releases/yarn-1.22.5.cjs:307:12
    at getToken (/home/runner/work/external-adapters-js/external-adapters-js/.yarn/releases/yarn-1.22.5.cjs:35270:18)
    at Object.<anonymous> (/home/runner/work/external-adapters-js/external-adapters-js/.yarn/releases/yarn-1.22.5.cjs:98537:65)
    at Generator.next (<anonymous>)
error No token found and can't prompt for login when running with --non-interactive.
info Visit https://yarnpkg.com/en/docs/cli/publish for documentation about this command.
error Command failed.
Exit code: 1
Command: /opt/hostedtoolcache/node/12.18.3/x64/bin/node
Arguments: /home/runner/work/external-adapters-js/external-adapters-js/.yarn/releases/yarn-1.22.5.cjs publish --verbose --non-interactive --access public
Directory: /home/runner/work/external-adapters-js/external-adapters-js/external-adapter
Output:

info Visit https://yarnpkg.com/en/docs/cli/workspace for documentation about this command.
##[error]Process completed with exit code 1.

Could not load typescript compiler with NPM package name

On zipping files for serverless consumption:

make zip adapter=coinpaprika

I get the following error:

Error: Could not load TypeScript compiler with NPM package name `/home/patrick/.npm/_npx/3340692/lib/node_modules/@vercel/ncc/dist/ncc/typescript.js`. Are you sure it is correctly installed?

I've tried changing devDependencies and running npm link typescript, as suggested in this issue: vercel/ncc#622

I've attempted on:
OS: Ubuntu 20.04.1 LTS and macOS Big Sur 11.1
Node: 10.19.0, 12.18.4, 14.15.2 (and the various associated npm versions)

Any thoughts?

Error: Chromium revision is not downloaded.

Steps to reproduce

  • Switch on Chainlink VPN.
  • curl https://lambda.staging.devnet.tools/googlefinance/call -d '{"data": {"from": "FTSE", "to": "GBP" }}'.

Expected outcome

Status code 200 with a response object containing result field.

Actual outcome

{"jobRunID":"1","status":"errored","error":{"error":{"name":"AdapterError","message":"Chromium revision is not downloaded. Run \"npm install\" or \"yarn install\""}},"statusCode":500}

GCP nodejs v10 or 12?

The docs say that for GCP you should use Node.js v10, but for AWS Lambda v12. Should they both be v12?

Docker run cannot read from a file

While acceptance testing adapters, the command
docker run -p 8080:8080 --env-file=PATH -it proof-of-reserves-adapter:latest
could not be used to start a Docker container with the correct environment variables.

Having this work would make things a lot easier for node operators, who need to start 5+ containers to run composite adapters.

Circular dependency between bootstrap and external-adapter

We currently use shorthand ambient modules in our @chainlink/types:

declare module '@chainlink/external-adapter'
declare module '@chainlink/ea-bootstrap'

These stifle types by setting every export to be any, defeating our type safety.
We are unable to remove them because of a circular dependency between these two packages.

Commit b847ff2 is a band-aid that allows CI to run despite this issue. It should be removed.

Secrets module

We can't commit API keys to the public repository.

But you are right that integration tests set up like this are not very usable. We want to somehow start integration tests for all adapters at once, but then how do we inject a different API key into each?

One possible solution would be to have a secrets.json configured with all secrets required for integration tests. Then every test can read from there.

Originally posted by @krebernisak in #218 (comment)

API keys from Config

There are a couple of places in the codebase where API keys are accessed from within an endpoint's code. These instead need to be handled in makeConfig.

This change will interfere with validation tests. These tests currently do not have any API keys. They could be passed into tests through the Config.
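A hedged sketch of what this could look like (the Config shape and default here are assumptions): the key is read once in makeConfig, and tests can inject a key through the factory argument instead of setting environment variables.

```typescript
// Read the API key in makeConfig rather than inside endpoint code.
export type Config = { apiKey?: string }

export const makeConfig = (apiKey = process.env.API_KEY): Config => ({ apiKey })
```

Validation tests could then call `makeConfig('test-key')` directly.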

Originally posted by @justinkaseman in #226 (comment)

Balance endpoints incorrectly parse units

Description

Balance endpoints were originally created for Bitcoin Proof of Reserves.
The External Adapters were engineered to handle more than just Bitcoin.
In some cases the balance units need to be converted.

A recent PR hard coded conversion to BTC, when it could be multiple currencies.
A better utility function should be created or support for other currencies should be disabled.

This issue can be re-created by running yarn test in the cryptoapis adapter.
Ethereum chains will throw this error when trying to convert the units:

Error: fractional component exceeds decimals (fault="underflow", operation="parseFixed", code=NUMERIC_FAULT, version=bignumber/5.0.14)
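A better utility might convert base-unit balances using per-chain decimals instead of hard-coding BTC's 8. The sketch below is illustrative only (the decimals table and function name are assumptions) and uses plain string arithmetic to avoid underflow on 18-decimal chains:

```typescript
// Convert a base-unit balance string (satoshis, wei, ...) to main units.
const DECIMALS: Record<string, number> = { bitcoin: 8, ethereum: 18 }

export const toMainUnit = (balance: string, chain: string): string => {
  const decimals = DECIMALS[chain]
  if (decimals === undefined) throw new Error(`Unsupported chain: ${chain}`)
  // Pad so there is at least one whole-unit digit, then split at the decimal point.
  const padded = balance.padStart(decimals + 1, '0')
  const whole = padded.slice(0, -decimals)
  const frac = padded.slice(-decimals).replace(/0+$/, '')
  return frac ? `${whole}.${frac}` : whole
}
```

For example, `toMainUnit('2541400', 'bitcoin')` gives `'0.025414'` and `toMainUnit('1000000000000000000', 'ethereum')` gives `'1'`.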

TS export = prevents us from exporting types

We should migrate from the way we currently export from adapters:

// only values can be exported here
export = { NAME, /* types */, makeExecute, makeConfig, ...expose(util.wrapExecute(makeExecute())) }

... to something like this:

const handlers = expose(util.wrapExecute(makeExecute()))

export { NAME, types, makeExecute, makeConfig, handlers }

Improve support for multiple keys in API_KEY env variable

There are still some places in the codebase that use util.getRequiredEnv('API_KEY') instead of getRandomRequiredEnv function.

We'll need to update those and also make sure that we add the ability to specifically require the API key when using the Requester.getDefaultConfig function, as some adapters use it but don't require an API key to be set, for example the dydx-stark adapter.

Make integration tests more usable by mocking HTTP requests

Instead of using Requester static functions to make HTTP requests, we would like to rewrite this functionality as an HTTP module object that gets injected into the adapter, and as such can be easily mocked in tests.

This will enable easier tests, better test coverage, and running more tests in the CI environment, increasing the overall codebase safety.
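A sketch of the injectable HTTP module idea (the interface and names below are assumptions, not the planned API):

```typescript
// An HTTP client interface that adapters receive as a dependency.
export interface HttpClient {
  get<T>(url: string): Promise<T>
}

// The adapter factory takes the client instead of calling Requester's statics.
export const makeExecute =
  (http: HttpClient) =>
  async (base: string): Promise<number> => {
    const body = await http.get<{ price: number }>(`/price?base=${base}`)
    return body.price
  }
```

Tests can then inject a mock client, e.g. `{ get: async () => ({ price: 42 }) }`, and never touch the network.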

[FEATURE] Serverless Framework and OpenWhisk Integration

OpenWhisk is an opensource FaaS solution that can easily be deployed on Kubernetes. I'd like to make it possible to deploy all of the adapters on OpenWhisk with the help of Serverless Framework.

This way, deployment of new adapters can be as easy as sls deploy.

Improve documentation on PoR & composite adapters in general

Currently, it's not straightforward for node operators to quickly understand how to build and run composite adapters like Proof of Reserves. Composite adapters have underlying adapters, each of which has some kind of configuration and/or defaults. All of this info is currently obscure and hard to find.

As suggested by @tyrion70:

Ideally the README has everything needed for a node operator to get it started together with the jobspecs of course

Investigate test timeouts

I believe we need a timeout to give error tests time to retry, though this needs more investigation.

The problem I have noticed is that different APIs take different amounts of time to time out. In the example of 1forge I had to increase it this much to pass, whereas other adapters could use lower values (usually 3000 ms, to allow three 1-second retries).

Originally posted by @justinkaseman in #226 (comment)

Refactor getting environment variables

Maybe we could avoid some duplication with the other method, e.g. by passing extra parameters: (name: string, required = true, delimiter = ',', prefix = '') and using util.getRandomEnv('API_KEY') in all places.

Then we can call util.getRandomEnv('API_KEY', false) in places where we don't want the variable to be required.

Or something similar.
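A sketch of the unified helper with the signature suggested in that comment (the body is an assumption):

```typescript
// One env getter covering required/optional, delimited multi-value, and prefixed names.
export const getRandomEnv = (
  name: string,
  required = true,
  delimiter = ',',
  prefix = '',
): string | undefined => {
  const key = prefix ? `${prefix}_${name}` : name
  const raw = process.env[key]
  if (!raw) {
    if (required) throw new Error(`Please set the required env: ${key}`)
    return undefined
  }
  // Pick one value at random when the variable holds multiple delimited keys.
  const values = raw.split(delimiter)
  return values[Math.floor(Math.random() * values.length)]
}
```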

Originally posted by @ebarakos in #226 (comment)

Log adapter errors

Currently, if there is some internal error, the client is just notified with a 500 without anything being logged. Logging these errors would be really helpful for debugging and for recognizing problems in production adapters faster.

Sequences processing semantics [Draft]

Some of our adapters, like the PoR bitcoin indexer adapters, process sequenced data.

Adapters that process sequences should only change items that they know how to process. If they don’t have the ability to process them they should add a warning. Maybe the next adapter in the chain will have the know-how to process those items and possibly remove the warning.

If they have the ability to process the item, but the item is found to be invalid, an error should be thrown, as there is no way to continue processing the sequence. For example, if we have an adapter that processes bitcoin/mainnet addresses, that adapter should add warnings for all bitcoin/testnet and ethereum/mainnet addresses. But if it finds an invalid bitcoin/mainnet address, it throws an error!

We can additionally add a strict configuration flag, which we can use to configure the sequence processing adapter with: “No warnings, all errors.” Or in other words, you need to be able to process everything or nothing.
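The semantics above can be sketched as follows (all types and names here are hypothetical, for illustration only):

```typescript
// Warn on items we cannot process; throw on items we can process but are invalid.
type SeqItem = { address: string; warnings: string[] }

export const processItem = (
  item: SeqItem,
  canProcess: (address: string) => boolean,
  isValid: (address: string) => boolean,
  strict = false,
): SeqItem => {
  if (!canProcess(item.address)) {
    // Strict mode: "no warnings, all errors".
    if (strict) throw new Error(`Unable to process: ${item.address}`)
    return { ...item, warnings: [...item.warnings, `skipped: ${item.address}`] }
  }
  if (!isValid(item.address)) throw new Error(`Invalid address: ${item.address}`)
  return item // processed; a later adapter in the chain could clear warnings
}
```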

TODO:

  • How sequence processing semantics should work
  • How we should chain different adapters with different processing abilities
  • How error handling should work
  • How we can extract this into a framework to easily add sequence processing abilities to an adapter

Add the API's response.data to the data field of the adapter's response

I have noticed that on some of the JS adapters, we were returning the actual API's response.data field.

We might need to alter the example TS adapter so that, in the simple/standard cases, this field is appended to the data object to maintain backwards compatibility.

Current:

  const response = await Requester.request(reqConfig, customError)
  const result = Requester.validateResultNumber(response.data, ['price'])
  return Requester.success(jobRunID, {
    data: { result },
    result,
    status: 200,
  })

Suggestion:

  const response = await Requester.request(reqConfig, customError)
  const result = Requester.validateResultNumber(response.data, ['price'])
  return Requester.success(jobRunID, {
    data: { ...response.data, result },
    result,
    status: 200,
  })

Config improvements

Description

Re-usable Configs were recently added to reduce shared code between EAs. This was a simple first-step and should be improved upon.

There was concern about them being too static. A generic type was added to ExecuteFactory to allow many Config types. This concern should be given more thought for the future.

These configurations should be extendable and able to come from a variety of sources (URL params, ENV variables, arguments to the factory, etc.).

The composability of these configurations in their current form should be verified.

Settle on a single Big number library

It seems we currently use all the big number libraries out there: ethers.BigNumber, decimal.js, bignumber.js, bn.js, and big.js.

We should settle on a single library and use that globally throughout the repo.

Refactor CMC price endpoint & add tests

Seems good, left only one potential change.

The price endpoint, though, is quite complicated. We might need to add various test cases regarding slug/cid and do some good acceptance testing. There is probably room for refactoring, either now or at some point, once we cover all cases with tests.

Agreed that there is probably room for refactoring and additional tests, but for the sake of staying in the scope of the TS refactor and hitting this release I think we should instead open a new ticket for that.

Opening one now for CMC price endpoint refactor.

Originally posted by @justinkaseman in #218 (comment)
