

EOSIO Contract API

The aim of this project is to provide a framework for filling and querying the state and history of specific contracts on EOSIO-based blockchains.

This project uses the EOSIO State History Plugin as its data source and PostgreSQL to store and query the data. Per-block transactions guarantee that the database is consistent at all times.

Requirements

  • NodeJS >= 16.0
  • PostgreSQL >= 13.0
    • You need to enable the pg_trgm extension with CREATE EXTENSION pg_trgm;
  • Redis >= 5.0
  • Nodeos >= 1.8.0 (only tested with 2.0 and 2.1) with the State History Plugin enabled and the options trace-history = true and chain-state-history = true set
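The State History Plugin requirement above corresponds to a nodeos config.ini fragment along these lines (a sketch; the endpoint address is a placeholder and must match the ship URL used in connections.config.json):

```ini
# Example config.ini fragment enabling the State History Plugin
plugin = eosio::state_history_plugin
trace-history = true
chain-state-history = true
state-history-endpoint = 127.0.0.1:8080
```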

Suggestions

  • Hasura GraphQL Engine >= 1.3 (if you want to allow GraphQL queries)
  • PGAdmin 4 (Interface to manage the postgres database)

Configuration

The config folder contains three configuration files:

connections.config.json

This file contains the Postgres / Redis / Nodeos connection data for the chain being indexed.

Notes

  • Redis: can be used for multiple chains without further action
  • PostgreSQL: each chain needs its own postgres database (multiple chains can share the same postgres instance), but multiple readers of the same chain can use the same database if they do not conflict
  • Nodeos: nodeos should have full state history for the block range you are trying to index
{
  "postgres": {
    "host": "127.0.0.1",
    "port": 5432,
    "user": "username",
    "password": "changeme",
    "database": "api-wax-mainnet-atomic-1"
  },
  "redis": {
    "host": "127.0.0.1",
    "port": 6379
  },
  "chain": {
    "name": "wax-mainnet",
    "chain_id": "1064487b3cd1a897ce03ae5b6a865651747e2e152090f99c1d19d44e01aea5a4",
    "http": "http://127.0.0.1:8888",
    "ship": "ws://127.0.0.1:8080"
  }
}

readers.config.json

This file is used to configure the filler.

For atomicassets / atomicmarket you should specify the following start blocks:

  • wax-mainnet: 64000000
  • wax-testnet: 35795440 (you must use this start block here, otherwise indexing will break)
  • eos-mainnet: 99070000
  • proton-mainnet: 50289000
  • proton-testnet: 53440000
[
  // Multiple readers can be defined and each one will run in a separate thread
  {
    "name": "atomic-1", // Name of the reader. Should be unique per chain and should not change after it was started

    "start_block": 0, // start at a specific block. If ready was already started, this can only be higher than the last indexed block
    "stop_block": 0, // stop at a specific block
    "irreversible_only": false, // If you need data for a lot of contracts and do not need live data, this option is faster

    "ship_prefetch_blocks": 50, // How many unconfirmed blocks ship will send
    "ship_min_block_confirmation": 30, // After how many blocks the reader will confirm the blocks
    "ship_ds_queue_size": 20, // how many blocks the reader should pre-serialize the action / table data
      
    "ds_ship_threads": 4, // How many threads should be used to deserialize traces and table deltas

    "db_group_blocks": 10, // In catchup mode, the reader will group this amount of bl

    "contracts": [
      // AtomicAssets handler which provides data for the AtomicAssets NFT standard
      {
        "handler": "atomicassets",
        "args": {
          "atomicassets_account": "atomicassets", // Account where the contract is deployed
          "store_logs": true, // store logs
          "store_transfers": true // store the transfer history
        }
      }
    ],
    
    "list_polls": [ // optional
      {
        url: 'https://example.com/lists', // endpoint for the lists
        api_key: '123', // send as X-API-Key header to the server
        frequency: 600, // optional, poll frequency in seconds, defaults to 10 minutes
      }
    ]
  }
]

server.config.json

{
  "provider_name": "pink.network", // Provider which is show in the endpoint documentation
  "provider_url": "https://pink.network",

  "server_addr": "0.0.0.0", // Server address to bind to
  "server_name": "wax.api.atomicassets.io", // Server name which is shown in the documentation
  "server_port": 9000, // Server Port

  "cache_life": 2, // GET endpoints are cached for this amount of time (in seconds)
  "trust_proxy": true, // Enable if you use a reverse proxy to have correct rate limiting by ip

  "rate_limit": {
    "interval": 60, // Interval to reset the counter (in seconds)
    "requests": 240 // How much requests can be made in the defined interval
  },
    
  "ip_whitelist": [], // These IPs are not rate limited or receive cached requests
  "slow_query_threshold": 7500, // If specific queries take longer than this threshold a warning is created

  "max_query_time_ms": 10000, // max execution time for a database query
  "max_db_connections": 50, // max number of concurrent db connections / db queries
        
  "namespaces": [
    // atomicassets namespace which provides an API for basic functionalities
    {
      "name": "atomicassets", 
      "path": "/atomicassets", // Each API endpoint will start with this path
      "args": {
        "atomicassets_account": "atomicassets" // Account where the contract is deployed
      }
    }
  ]
}

Installation

This project consists of two separate processes which need to be started and stopped independently:

  • The API, which provides the socket and REST endpoints
  • The filler, which reads data from the blockchain and fills the database

The filler needs to be started before the API when running it for the first time:

Prerequisites:

  • PostgreSQL

    • Create a database and a user which is allowed to read and write on that database
  • EOSIO node

    • State History Plugin enabled with options trace-history = true, chain-state-history = true
    • Fully synced for the block range you want to process
    • Open socket and http api
  • Copy and modify example configs with the correct connection params
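The PostgreSQL prerequisite could be set up roughly like this (a sketch; the user name, password, and database name are placeholders matching the example config, and pg_trgm must be created inside the chain database):

```sql
-- Run as a postgres superuser; names and passwords are placeholders
CREATE USER username WITH PASSWORD 'changeme';
CREATE DATABASE "api-wax-mainnet-atomic-1" OWNER username;

-- Then, connected to that database:
CREATE EXTENSION IF NOT EXISTS pg_trgm;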

There are two suggested ways to run the project: Docker if you want to containerize the application, or PM2 if you want to run it at the system level.

Docker

  1. git clone && cd eosio-contract-api
  2. There is an example docker compose file provided
  3. docker-compose up -d

Start

  • docker-compose start eosio-contract-api-filler
  • docker-compose start eosio-contract-api-server

Stop

  • docker-compose stop eosio-contract-api-filler
  • docker-compose stop eosio-contract-api-server

PM2

  1. git clone && cd eosio-contract-api
  2. yarn install
  3. yarn global add pm2

Start

  • pm2 start ecosystems.config.json --only eosio-contract-api-filler
  • pm2 start ecosystems.config.json --only eosio-contract-api-server

Stop

  • pm2 stop eosio-contract-api-filler
  • pm2 stop eosio-contract-api-server

Currently Supported Contracts

Readers (used to fill the database)

Readers are used to fill the database for a specific contract.

atomicassets

{
  "handler": "atomicassets",
  "args": {
    "atomicassets_account": "atomicassets", // account where the atomicassets contract is deployed
    "store_transfers": true, // store the transfer history  
    "store_logs": true // store data structure logs
  }
}

atomicmarket

This reader requires an atomicassets and a delphioracle reader configured with the same contracts as specified here.

{
  "handler": "atomicmarket",
  "args": {
    "atomicassets_account": "atomicassets", // account where the atomicassets contract is deployed
    "atomicmarket_account": "atomicmarket", // account where the atomicmarket contract is deployed
    "store_logs": true // Store logs of sales / auctions
  }
}
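Since the atomicmarket handler depends on atomicassets and delphioracle data, a reader's contracts array would typically combine all three handlers. A sketch (the account names are the defaults and may differ per chain):

```json
"contracts": [
  { "handler": "atomicassets", "args": { "atomicassets_account": "atomicassets" } },
  { "handler": "delphioracle", "args": { "delphioracle_account": "delphioracle" } },
  {
    "handler": "atomicmarket",
    "args": {
      "atomicassets_account": "atomicassets",
      "atomicmarket_account": "atomicmarket"
    }
  }
]
```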

delphioracle

{
  "handler": "delphioracle",
  "args": {
    "delphioracle_account": "delphioracle" // account where the delphioracle contract is deployed
  }
}

Namespace (API endpoints)

A namespace provides an API for a specific contract or use case and is based on the data a reader provides.

atomicassets

{
  "handler": "atomicassets",
  "args": {
    "atomicassets_account": "atomicassets", // account where the atomicassets contract is deployed
    "connected_reader": "atomic-1" // reader to which the API connects for live data
  }
}

atomicmarket

{
  "handler": "atomicmarket",
  "args": {
    "atomicmarket_account": "atomicmarket", // account where the atomicmarket contract is deployed
    "connected_reader": "atomic-1" // reader to which the API connects for live data
  }
}

Testing

To run the tests for this project:

  1. In the config folder, copy the example-* files and rename them to connections.config.json, readers.config.json and server.config.json
  2. Start the Redis and PostgreSQL servers. You can do that using Docker Compose (docker compose up eosio-contract-api-redis eosio-contract-api-postgres) or by installing them directly on your machine.
  3. Modify connections.config.json to point to your local machine and use the correct credentials.
  4. Execute the init-test-db script using the command yarn dev:init-test-db
  5. Run the test using the command yarn dev:test

Developing new features

  1. Make sure that everything in the code base is working properly, e.g. tests passing and the application compiling.
  2. Develop the feature or bug fix.
  3. Add test cases to cover your code.
  4. Test it manually using the API.
  5. Create a PR and address the reviewer's comments.
  6. Merge and deploy.

eosio-contract-api's People

Contributors

andreme, dependabot[bot], devald, fabian-emilius, frontierpsychiatrist, hugomspielworks


eosio-contract-api's Issues

[Bug] Failed to execute migration scripts relation "atomicmarket_stats_prices_master"

I'm getting this error when upgrading from v1.3.11 to the latest version:

0|eosio-contract-api-filler | 2022-10-22T09:07:38.454Z [PID:33481] [info] : Process stopping - cleaning up transactions...
0|eosio-contract-api-filler | 2022-10-22T09:07:38.454Z [PID:33481] [info] : All transactions aborted
0|eosio-contract-api-filler | 2022-10-22T09:07:38.916Z [PID:33503] [info] : Starting workers...
0|eosio-contract-api-filler | 2022-10-22T09:07:39.032Z [PID:33503] [info] : Checking for available upgrades...
0|eosio-contract-api-filler | 2022-10-22T09:07:39.044Z [PID:33503] [info] : Found 4 available upgrades. Starting to upgradeDB...
0|eosio-contract-api-filler | 2022-10-22T09:07:39.044Z [PID:33503] [info] : Upgrade to 1.3.14 ...
0|eosio-contract-api-filler | 2022-10-22T09:07:39.046Z [PID:33503] [info] : Upgraded atomicassets to 1.3.14
0|eosio-contract-api-filler | 2022-10-22T09:07:39.046Z [PID:33503] [info] : Upgraded delphioracle to 1.3.14
0|eosio-contract-api-filler | 2022-10-22T09:07:54.506Z [PID:33503] [error] : Failed to execute migration scripts relation "atomicmarket_stats_prices_master" does not exist {"length":2316,"name":"error","severity":"ERROR","code":"42P01","internalPosition":"108","internalQuery":"WITH templates AS MATERIALIZED (\n SELECT DISTINCT template_id, assets_contract\n FROM atomicmarket_stats_prices_master\n WHERE template_id IS NOT NULL\n ), sales AS MATERIALIZED (\n SELECT assets_contract, SUBSTRING(f FROM 2)::BIGINT template_id, MIN(price) min_price\n FROM atomicmarket_sales_filters_listed\n JOIN LATERAL UNNEST(filter) u(f) ON u.f LIKE 't%'\n WHERE seller_contract IS DISTINCT FROM TRUE\n AND asset_count = 1\n \t AND updated_at_time + 0 <= (current_block_time - 3600 * 24 * 3 * 1000) -- only include sales older than 3 days\n GROUP BY template_id, assets_contract\n )\n SELECT template_id, assets_contract, sug.suggested_median, sug.suggested_average\n FROM templates\n LEFT OUTER JOIN sales USING (template_id, assets_contract)\n CROSS JOIN LATERAL (\n SELECT\n LEAST(PERCENTILE_DISC(0.5) WITHIN GROUP (ORDER BY price), sales.min_price) suggested_median,\n LEAST(AVG(price)::BIGINT, sales.min_price) suggested_average\n FROM (\n (\n SELECT listing_id /* not used, but required to prevent the same price being discarded in the union*/, price\n FROM atomicmarket_stats_prices_master\n WHERE template_id = templates.template_id AND assets_contract = templates.assets_contract\n AND time >= ((extract(epoch from now() - '3 days'::INTERVAL)) * 1000)::BIGINT\n )\n UNION\n (\n SELECT listing_id, price\n FROM atomicmarket_stats_prices_master\n WHERE template_id = templates.template_id AND assets_contract = templates.assets_contract\n ORDER BY time DESC\n LIMIT 5\n )\n ) prices\n ) sug","where":"PL/pgSQL function update_atomicmarket_template_prices() line 8 at FOR over SELECT rows","file":"parse_relation.c","line":"1384","routine":"parserOpenTable","stack":"error: relation 
"atomicmarket_stats_prices_master" does not exist\n at Parser.parseErrorMessage (/root/eosio-contract-api/node_modules/pg-protocol/src/parser.ts:369:69)\n at Parser.handlePacket (/root/eosio-contract-api/node_modules/pg-protocol/src/parser.ts:188:21)\n at Parser.parse (/root/eosio-contract-api/node_modules/pg-protocol/src/parser.ts:103:30)\n at Socket. (/root/eosio-contract-api/node_modules/pg-protocol/src/index.ts:7:48)\n at Socket.emit (node:events:520:28)\n at addChunk (node:internal/streams/readable:315:12)\n at readableAddChunk (node:internal/streams/readable:289:9)\n at Socket.Readable.push (node:internal/streams/readable:228:10)\n at TCP.onStreamRead (node:internal/stream_base_commons:190:23)"}
0|eosio-contract-api-filler | 2022-10-22T09:07:54.507Z [PID:33503] [info] : Process stopping - cleaning up transactions...
0|eosio-contract-api-filler | 2022-10-22T09:07:54.507Z [PID:33503] [info] : All transactions aborted
0|eosio-contract-api-filler | 2022-10-22T09:07:54.960Z [PID:33525] [info] : Starting workers...

I see that atomicmarket_stats_prices_master was removed in v1.3.13. So in the atomicmarket.sql migration script, should we replace atomicmarket_stats_prices_master with atomicmarket_stats_markets? Is that correct?

Telos testnet not syncing

I was syncing the Telos testnet from block 104880000, but it stopped halfway through with the following error.

eosio-contract-api-filler      | 2022-05-20T04:51:20.930Z [PID:39] [info] : Launching deserialization worker... 
eosio-contract-api-filler      | 2022-05-20T04:51:21.242Z [PID:39] [warn] : Could not find ABI for atomicmarket in cache, so requesting it... 
eosio-contract-api-filler      | 2022-05-20T04:51:22.459Z [PID:39] [warn] : Could not find ABI for atomicassets in cache, so requesting it... 
eosio-contract-api-postgres    | 2022-05-20 04:51:23.852 UTC [35] ERROR:  insert or update on table "atomicmarket_auctions" violates foreign key constraint "atomicmarket_auctions_maker_marketplace_fkey"
eosio-contract-api-postgres    | 2022-05-20 04:51:23.852 UTC [35] DETAIL:  Key (market_contract, maker_marketplace)=(atomicmarket, ) is not present in table "atomicmarket_marketplaces".
eosio-contract-api-postgres    | 2022-05-20 04:51:23.852 UTC [35] STATEMENT:  COMMIT
eosio-contract-api-filler      | 2022-05-20T04:51:23.853Z [PID:39] [error] : Failed to execute SQL query  {"queryText":"COMMIT","values":[],"error":{"length":386,"name":"error","severity":"ERROR","code":"23503","detail":"Key (market_contract, maker_marketplace)=(atomicmarket, ) is not present in table \"atomicmarket_marketplaces\".","schema":"public","table":"atomicmarket_auctions","constraint":"atomicmarket_auctions_maker_marketplace_fkey","file":"ri_triggers.c","line":"2465","routine":"ri_ReportViolation"}}
eosio-contract-api-filler      | 2022-05-20T04:51:23.853Z [PID:39] [error] : Error occurred while executing block range from #109316981 to 109316990 
eosio-contract-api-postgres    | 2022-05-20 04:51:23.854 UTC [35] WARNING:  there is no transaction in progress
eosio-contract-api-filler      | 2022-05-20T04:51:23.855Z [PID:39] [error] : Consumer queue stopped due to an error at #109316990 Release called on client which has already been released to the pool. {"stack":"Error: Release called on client which has already been released to the pool.\n    at throwOnDoubleRelease (/home/application/app/node_modules/pg-pool/index.js:27:9)\n    at Client.release (/home/application/app/node_modules/pg-pool/index.js:296:9)\n    at ContractDBTransaction.abort (/home/application/app/build/filler/database.js:485:25)\n    at processTicksAndRejections (node:internal/process/task_queues:96:5)\n    at async StateReceiver.process (/home/application/app/build/filler/receiver.js:175:17)\n    at async /home/application/app/build/filler/receiver.js:90:17\n    at async run (/home/application/app/node_modules/p-queue/dist/index.js:163:29)"}
eosio-contract-api-filler      | 2022-05-20T04:51:30.769Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:40|JQ:0] - Stopping in 3590 seconds 

Any ideas?

Running api without local node

I'm trying to eventually run this API for the Telos testnet and mainnet, but I am a bit stuck.

My first attempt at connecting to wax testnet

I used the default settings for all the config files; I just changed the start_block to 35795000 and the connections such that:

"postgres": {
    "host": "eosio-contract-api-postgres",
    "port": 5432,
    "user": "root",
    "password": "changeme",
    "database": "root"
  },
  "redis": {
    "host": "eosio-contract-api-redis",
    "port": 6379
  },
  "chain": {
    "name": "wax-testnet",
    "chain_id": "f16b1833c747c43682f4386fca9cbb327929334a762755ebec17f6f23c9b8a12",
    "http": "https://testnet.wax.pink.gg",
    "ship": "wss://testnet.wax.pink.gg"
  }

Then I ran docker-compose up -d
I got the same issue in the filler as #51 (No blocks processed [DS:0|SH:0|JQ:0])

My first question is: is it okay to point to another endpoint with state history like this? Does that endpoint even have state history enabled? I just assumed it would; how can I know?

Secondly: I had to change the hosts to the Docker container names or else it won't connect. Is there a better way to do this? If it's localhost then yarn test works, but with the config above it fails. I'm sure this is just a Docker networking detail I don't understand yet.

Thanks

Prometheus Metrics

I'm no expert with Prometheus, but I would like to extract the metrics available on the API.

I see it's running at /metrics, but if I query that I don't see anything.

How do I go about scraping those metrics? Any tips or guides will be appreciated

Is there a way to know what collections are whitelisted/verified?

I want to make a notification service for my discord server to automatically notify my community about important drops.
Looking at the API, I could not figure out a way to tell whether a collection is verified or whitelisted, or if it is a potential scam.

Is there a way to get this information through an API and not just through a browser by looking for the checkmarks?

Materialized views removed, slow querying

We've been trying to use this to index, but in recent versions the materialized views were removed and querying is much slower. Is there a reason for that?

Any suggestions to improve on that?

Filler Starts But Won't Process Blocks

Followed the install directions located at https://github.com/pinknetworkx/eosio-contract-api

The server starts, populates database tables, and maintains connections to the database and Redis servers, but never attempts to connect to the blockchain API or SHIP nodes. Port 9001 is open and listening for connections, but no log output is written to the error.log file in the logs directory (or anywhere else).

Unclear how to troubleshoot or force log output. Any assistance would be greatly appreciated.

[Bug] relation "atomicmarket_stats_markets_updates" does not exist at character 10466

eosio-contract-api-filler    | 2022-09-23T15:44:47.425Z [PID:28] [info] : Starting workers... 
eosio-contract-api-filler    | 2022-09-23T15:44:47.679Z [PID:28] [info] : Checking for available upgrades... 
eosio-contract-api-filler    | 2022-09-23T15:44:47.682Z [PID:28] [info] : Found 3 available upgrades. Starting to upgradeDB... 
eosio-contract-api-filler    | 2022-09-23T15:44:47.682Z [PID:28] [info] : Upgrade to 1.3.15 ... 
eosio-contract-api-postgres  | 2022-09-23 15:44:47.761 UTC [36] ERROR:  relation "atomicmarket_stats_markets_updates" does not exist at character 10466
eosio-contract-api-postgres  | 2022-09-23 15:44:47.761 UTC [36] STATEMENT:  /*
eosio-contract-api-postgres  |  -- Run before upgrade to make the migration faster:

Getting this when running docker compose up. With a new clean repo.
This suddenly started happening this week and wasn't a problem before.
I didn't pull the latest changes, so I assume it's something external that changed.
Is anyone else getting this all of a sudden?

EDIT:
This is with the latest release 1.3.17.
This happened because I removed the delphioracle and market contracts from readers.config,
resulting in a catch-22: the tables aren't created, but the migrations depend on them.
The migrations might just need a review.

My temporary solution was just to roll back to previous release.

Bug with POST requests ignoring page and limit arguments

I found a bug with how atomicassets-js and the API interact.
It is caused by this commit: pinknetworkx/atomicassets-js@2bfed57

This commit makes it so that if a query string is too long it will use POST instead of GET. Great in theory, but it encodes the args as JSON, which makes limit and page integers.

If you make a POST request to /atomicassets/v1/assets with a JSON body with page and limit as integers the API ignores these. The API will only use them if they are sent as strings, which atomicassets-js does not do.

This should be fixed on the API side so that page and limit as integers are not ignored.
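A possible API-side fix (a sketch, not the project's actual code; the helper name is hypothetical) is to coerce page and limit whether they arrive as numeric strings in a query string or as integers in a JSON body:

```typescript
// Hypothetical helper: accept a pagination value as a number or a numeric
// string, and fall back to a default for anything else
function toPositiveInt(value: unknown, fallback: number): number {
  const n =
    typeof value === 'string' ? parseInt(value, 10) :
    typeof value === 'number' ? Math.trunc(value) :
    NaN;
  return Number.isInteger(n) && n > 0 ? n : fallback;
}
```

With a helper like this, the request handler could treat `req.query.page` and `req.body.page` identically regardless of transport.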

Filler server: No blocks processed [DS:0|SH:0|JQ:1]

I started this project with docker.

These are the logs from the filler server.

2022-02-19T15:45:52.592Z [PID:39] [info] : Ship connect options {"start_block_num":166284797,"end_block_num":4294967295,"max_messages_in_flight":50,"have_positions":"removed","irreversible_only":false,"fetch_block":true,"fetch_traces":true,"fetch_deltas":true} 2022-02-19T15:45:52.682Z [PID:39] [info] : Receiving ABI from ship... 2022-02-19T15:45:54.490Z [PID:39] [info] : Launching deserialization worker... 2022-02-19T15:45:54.511Z [PID:39] [info] : Launching deserialization worker... 2022-02-19T15:45:54.502Z [PID:39] [info] : Launching deserialization worker... 2022-02-19T15:45:54.518Z [PID:39] [info] : Launching deserialization worker... 2022-02-19T15:45:54.531Z [PID:39] [info] : Launching deserialization worker... 2022-02-19T15:45:54.536Z [PID:39] [info] : Launching deserialization worker... 2022-02-19T15:45:54.534Z [PID:39] [info] : Launching deserialization worker... 2022-02-19T15:45:54.547Z [PID:39] [info] : Launching deserialization worker... 2022-02-19T15:45:54.515Z [PID:39] [info] : Launching deserialization worker... 2022-02-19T15:45:54.585Z [PID:39] [info] : Launching deserialization worker... 
2022-02-19T15:46:02.600Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3590 seconds 2022-02-19T15:46:07.601Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3585 seconds 2022-02-19T15:46:12.602Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3580 seconds 2022-02-19T15:46:17.603Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3575 seconds 2022-02-19T15:46:22.604Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3570 seconds 2022-02-19T15:46:27.606Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3565 seconds 2022-02-19T15:46:32.607Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3560 seconds 2022-02-19T15:46:37.608Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3555 seconds 2022-02-19T15:46:42.609Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3550 seconds 2022-02-19T15:46:47.610Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3545 seconds 2022-02-19T15:46:52.610Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3540 seconds 2022-02-19T15:46:57.611Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3535 seconds 2022-02-19T15:47:02.613Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3530 seconds 2022-02-19T15:47:07.614Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3525 seconds 2022-02-19T15:47:12.615Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3520 seconds 2022-02-19T15:47:17.616Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3515 seconds 
2022-02-19T15:47:22.616Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3510 seconds 2022-02-19T15:47:27.617Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3505 seconds 2022-02-19T15:47:32.617Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3500 seconds 2022-02-19T15:47:37.618Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3495 seconds 2022-02-19T15:47:42.618Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3490 seconds 2022-02-19T15:47:47.619Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3485 seconds 2022-02-19T15:47:52.619Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3480 seconds 2022-02-19T15:47:57.619Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3475 seconds 2022-02-19T15:48:02.620Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3470 seconds 2022-02-19T15:48:07.620Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3465 seconds 2022-02-19T15:48:12.620Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3460 seconds 2022-02-19T15:48:17.620Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3455 seconds 2022-02-19T15:48:22.620Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3450 seconds 2022-02-19T15:48:27.620Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3445 seconds 2022-02-19T15:48:32.620Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3440 seconds 2022-02-19T15:48:37.620Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3435 seconds 
2022-02-19T15:48:42.620Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3430 seconds 2022-02-19T15:48:47.619Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3425 seconds 2022-02-19T15:48:52.620Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3420 seconds 2022-02-19T15:48:57.619Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3415 seconds 2022-02-19T15:49:02.620Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3410 seconds 2022-02-19T15:49:07.620Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3405 seconds 2022-02-19T15:49:12.620Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3400 seconds 2022-02-19T15:49:17.621Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3395 seconds 2022-02-19T15:49:22.620Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3390 seconds 2022-02-19T15:49:27.621Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3385 seconds 2022-02-19T15:49:32.628Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3380 seconds 2022-02-19T15:49:37.627Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3375 seconds 2022-02-19T15:49:42.628Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3370 seconds 2022-02-19T15:49:47.628Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3365 seconds 2022-02-19T15:49:52.628Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3360 seconds 2022-02-19T15:49:57.640Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3355 seconds 
2022-02-19T15:50:02.640Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3350 seconds 2022-02-19T15:50:07.640Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3345 seconds 2022-02-19T15:50:12.640Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3340 seconds 2022-02-19T15:50:17.640Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3335 seconds 2022-02-19T15:50:22.641Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3330 seconds 2022-02-19T15:50:27.641Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3325 seconds 2022-02-19T15:50:32.642Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3320 seconds 2022-02-19T15:50:37.640Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3315 seconds 2022-02-19T15:50:42.641Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3310 seconds 2022-02-19T15:50:47.641Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3305 seconds 2022-02-19T15:50:52.641Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3300 seconds 2022-02-19T15:50:57.642Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3295 seconds 2022-02-19T15:51:02.646Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:0] - Stopping in 3290 seconds 2022-02-19T15:51:15.664Z [PID:39] [warn] : Reader atomic-1 - No blocks processed [DS:0|SH:0|JQ:1] - Stopping in 3277 seconds

On the last line, I can see JQ change to 1, but the server just stopped here for hours and nothing happened.
Please help me with this one.

error on wax test network block 5713181

I'm getting this in the logs and the filler stops processing blocks:

2022-04-01T14:11:03.685Z [PID:39] [error] : Error while processing data {"0":{"block_num":5713181,"block_id":"00572D1D5F287031296670F05D98B0BF5E74D858AAD890D8FAEB0CBCB201B86C","timestamp":"2020-01-08T14:06:54.500","producer":"blokcrafters","confirmed":0,"previous":"00572D1CBD84C4A07F9C5E08BC01D3642EF580A2717F063BF6AE22B7621C139F","transaction_mroot":"F55EAF60235FF5C72DF75DE13B392C1615C5A40D0356E01040283FD890AE1AA1","action_mroot":"4AD99641CEE022FC20B69EAD2E7ADAA575BBEAAB91812E7F5554F60C2307202F","schedule_version":44,"new_producers":null,"header_extensions":[],"producer_signature":"SIG_K1_KVa4tGBeT1Qxr7PEtje55iV7FTzZdtAHoA281Yi6oQ5GU9aVRkVUXp33AzACGTHCcr7Z9isNHaKJxLKzHwLaYmJ2ABKWzQ","transactions":[{"status":0,"cpu_usage_us":100,"net_usage_words":12,"trx":["packed_transaction",{"signatures":["SIG_K1_KfYkkbAHuos45XnM22RKU4k3UkyT9EZjc5oGbTFAc17gqzeSUsGCRVXfZK3ue1veqHdhw7mYz5fzBy3R33EQ3YAJnmbmpW"],"compression":0,"packed_context_free_data":{},"packed_trx":{"0":26,"1":226,"2":21,"3":94,"4":24,"5":45,"6":45,"7":139,"8":12,"9":243,"10":0,"11":0,"12":0,"13":0,"14":1,"15":128,"16":179,"17":194,"18":216,"19":32,"20":39,"21":105,"22":54,"23":0,"24":0,"25":0,"26":0,"27":0,"28":144,"29":221,"30":116,"31":1,"32":128,"33":179,"34":194,"35":216,"36":32,"37":39,"38":105,"39":54,"40":0,"41":0,"42":0,"43":0,"44":168,"45":237,"46":50,"47":50,"48":0,"49":0}}]}],"block_extensions":[],"last_irreversible":{"block_num":145901327,"block_id":"08B2470FDDEAB24C57E6801CF01252050220992EDFDE62B64A77A9FE722849AE"},"head":{"block_num":145901659,"block_id":"08B2485BDB246A0AF4223F855AEB010103B1177D5E4DDE7A047D4D24723FF077"}},"1":{"code":"atomicassets","scope":"atomicassets","table":"config","primary_key":"4982871454518345728","payer":"atomicassets","value":{"asset_counter":"1099511627780","offer_counter":"0","collection_format":[]},"present":true}}
2022-04-01T14:11:03.686Z [PID:39] [error] : Error occurred while executing block range from #5713172 to 5713181
2022-04-01T14:11:03.686Z [PID:39] [error] : Consumer queue stopped due to an error at #5713181 Cannot read properties of undefined (reading 'length') {"stack":"TypeError: Cannot read properties of undefined (reading 'length')\n at Object.callback (/home/application/app/build/filler/handlers/atomicassets/processors/config.js:12:82)\n at DataProcessor.executeHeadQueue (/home/application/app/build/filler/processor.js:236:36)\n at StateReceiver.process (/home/application/app/build/filler/receiver.js:154:38)\n at async /home/application/app/build/filler/receiver.js:90:17\n at async run (/home/application/app/node_modules/p-queue/dist/index.js:163:29)"}
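For reference, the stack trace points at an unguarded `.length` read in the compiled config processor. Below is a minimal sketch of the kind of defensive guard that avoids this class of crash; the function and field names are hypothetical, based only on the table row shown in the log above, and are not the project's actual code.

```javascript
// Hypothetical sketch: guard a possibly-undefined array before reading .length.
// The real code lives in build/filler/handlers/atomicassets/processors/config.js;
// this only illustrates the failure mode seen in the log above.
function collectionFormatLength(row) {
  // row.value.collection_format may be missing if the table delta was not
  // deserialized as expected, in which case `.length` throws a TypeError.
  const format = row && row.value && row.value.collection_format;
  return Array.isArray(format) ? format.length : 0;
}
```

With the row from the log (`"collection_format":[]`), this returns 0 instead of crashing when the field is absent.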

Metrics Guide

I'd like to request that the README be updated to explain how to set up and use the Prometheus metrics.

Perhaps just an example of how you are currently scraping the data etc.

Or even just some resources that might be helpful

Thanks!

[Error] running job Refresh MV

I upgraded from v1.3.11 to v1.3.17 and am getting these errors. The readers are still working fine.

Error running job Refresh MV atomicmarket_sale_prices relation "atomicmarket_sale_prices" does not exist {"length":122,"name":"error","severity":"ERROR","code":"42P01","file":"namespace.c","line":"435","routine":"RangeVarGetRelidExtended","stack":"error: relation "atomicmarket_sale_prices" does not exist\n at Parser.parseErrorMessage (/root/eosio-contract-api/node_modules/pg-protocol/src/parser.ts:369:69)\n at Parser.handlePacket (/root/eosio-contract-api/node_modules/pg-protocol/src/parser.ts:188:21)\n at Parser.parse (/root/eosio-contract-api/node_modules/pg-protocol/src/parser.ts:103:30)\n at Socket. (/root/eosio-contract-api/node_modules/pg-protocol/src/index.ts:7:48)\n at Socket.emit (node:events:520:28)\n at addChunk (node:internal/streams/readable:315:12)\n at readableAddChunk (node:internal/streams/readable:289:9)\n at Socket.Readable.push (node:internal/streams/readable:228:10)\n at TCP.onStreamRead (node:internal/stream_base_commons:190:23)"}
0|eosio-contract-api-filler | 2022-10-22T13:36:11.951Z [PID:5200] [error] : Error running job Refresh MV atomicmarket_template_prices "atomicmarket_template_prices" is not a materialized view {"length":121,"name":"error","severity":"ERROR","code":"0A000","file":"matview.c","line":"176","routine":"ExecRefreshMatView","stack":"error: "atomicmarket_template_prices" is not a materialized view\n at Parser.parseErrorMessage (/root/eosio-contract-api/node_modules/pg-protocol/src/parser.ts:369:69)\n at Parser.handlePacket (/root/eosio-contract-api/node_modules/pg-protocol/src/parser.ts:188:21)\n at Parser.parse (/root/eosio-contract-api/node_modules/pg-protocol/src/parser.ts:103:30)\n at Socket. (/root/eosio-contract-api/node_modules/pg-protocol/src/index.ts:7:48)\n at Socket.emit (node:events:520:28)\n at addChunk (node:internal/streams/readable:315:12)\n at readableAddChunk (node:internal/streams/readable:289:9)\n at Socket.Readable.push (node:internal/streams/readable:228:10)\n at TCP.onStreamRead (node:internal/stream_base_commons:190:23)"}
0|eosio-contract-api-filler | 2022-10-22T13:36:12.932Z [PID:5200] [error] : Error running job Refresh MV atomicmarket_stats_prices relation "atomicmarket_stats_prices" does not exist {"length":123,"name":"error","severity":"ERROR","code":"42P01","file":"namespace.c","line":"435","routine":"RangeVarGetRelidExtended","stack":"error: relation "atomicmarket_stats_prices" does not exist\n at Parser.parseErrorMessage (/root/eosio-contract-api/node_modules/pg-protocol/src/parser.ts:369:69)\n at Parser.handlePacket (/root/eosio-contract-api/node_modules/pg-protocol/src/parser.ts:188:21)\n at Parser.parse (/root/eosio-contract-api/node_modules/pg-protocol/src/parser.ts:103:30)\n at Socket. (/root/eosio-contract-api/node_modules/pg-protocol/src/index.ts:7:48)\n at Socket.emit (node:events:520:28)\n at addChunk (node:internal/streams/readable:315:12)\n at readableAddChunk (node:internal/streams/readable:289:9)\n at Socket.Readable.push (node:internal/streams/readable:228:10)\n at TCP.onStreamRead (node:internal/stream_base_commons:190:23)"}
0|eosio-contract-api-filler | 2022-10-22T13:36:13.917Z [PID:5200] [error] : Error running job Refresh MV atomicmarket_stats_markets "atomicmarket_stats_markets" is not a materialized view {"length":119,"name":"error","severity":"ERROR","code":"0A000","file":"matview.c","line":"176","routine":"ExecRefreshMatView","stack":"error: "atomicmarket_stats_markets" is not a materialized view\n at Parser.parseErrorMessage (/root/eosio-contract-api/node_modules/pg-protocol/src/parser.ts:369:69)\n at Parser.handlePacket (/root/eosio-contract-api/node_modules/pg-protocol/src/parser.ts:188:21)\n at Parser.parse (/root/eosio-contract-api/node_modules/pg-protocol/src/parser.ts:103:30)\n at Socket. (/root/eosio-contract-api/node_modules/pg-protocol/src/index.ts:7:48)\n at Socket.emit (node:events:520:28)\n at addChunk (node:internal/streams/readable:315:12)\n at readableAddChunk (node:internal/streams/readable:289:9)\n at Socket.Readable.push (node:internal/streams/readable:228:10)\n at TCP.onStreamRead (node:internal/stream_base_commons:190:23)"}
0|eosio-contract-api-filler | 2022-10-22T13:36:13.917Z [PID:5200] [info] : Reader atomic-1 - Progress: 209959621 / 209966908 (54.82%) Speed: 17.2 B/s 335 W/s [DS:0|SH:23|JQ:1] (Syncs in 7 minutes)
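The two error shapes ("relation does not exist" vs. "is not a materialized view") suggest the schema on disk no longer matches what the refresh jobs expect. One way to inspect this is to query `pg_class`, where `relkind` is `'m'` for a materialized view and `'r'` for an ordinary table. The helper below is a hypothetical diagnostic sketch (not project code) that classifies rows returned by such a query:

```javascript
// Hypothetical diagnostic sketch. Given rows from a query such as
//   SELECT relname, relkind FROM pg_class WHERE relname LIKE 'atomicmarket_%';
// report which expected materialized views are missing or have the wrong kind.
// In pg_class, relkind 'm' means materialized view and 'r' means ordinary table.
function checkMatviews(expectedNames, pgClassRows) {
  const kinds = new Map(pgClassRows.map(r => [r.relname, r.relkind]));
  return expectedNames.map(name => {
    if (!kinds.has(name)) return `${name}: does not exist`;
    if (kinds.get(name) !== 'm') return `${name}: exists but is not a materialized view`;
    return `${name}: ok`;
  });
}
```

If the names exist but as plain tables, the database was likely migrated to a schema where those views were replaced, and re-running the migrations for the target version would be the next thing to check.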

Simpleassets filler crashes at various steps

Hi -- I'm using eosio-contract-api to sync with the simpleassets contract and I stumbled upon some issues running the filler.

Sorry about the long list and thanks for taking the time to look into this!
Here's a list of what I've found; the steps to reproduce are below.

  1. simpleassets.sql file throws during the queries execution

    • simpleassets_authors specifies the constraint CONSTRAINT simpleassets_collections_pkey PRIMARY KEY (contract, collection_name), but the table itself does not create a field called collection_name, which makes the query throw. This is also inconsistent with the simpleassets contract, which doesn't have collections as part of the contract specification.
    • On lines 70 and 71, two indexes are created with the same name, which also makes the query throw.
  2. definitions/migrations/1.2.3/database.sql throws multiple primary keys for table "contract_traces" are not allowed

  3. Under filler/handlers/simpleassets/index.ts, line 90.

    • This line throws with the error INSERT has more expressions than target columns; removing the $3 from the query seems to fix the issue.
  4. Under filler/handlers/simpleassets/processors/assets.ts, line 30.

    • assetid is mistyped as asseetid, which also makes the query throw because the database expects a non-null value for asset_id.
  5. This is the last one I've found, also under filler/handlers/simpleassets/processors/assets.ts

    • Under the claim action, the query expects three variables but only two are passed; removing the AND owner = $3 seems to fix the issue.
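Item 5 is a mismatch between the number of $n placeholders in the SQL and the number of values supplied (item 3 is the related case of a VALUES list with more expressions than target columns). A tiny sanity check one could use while debugging such reports is sketched below; this is a hypothetical helper, not part of the project:

```javascript
// Hypothetical debugging helper: does a parameterized SQL string use exactly
// as many distinct $n placeholders as there are values supplied?
function placeholdersMatch(sql, values) {
  const distinct = new Set(sql.match(/\$\d+/g) || []);
  return distinct.size === values.length;
}

// Mirrors item 5: three placeholders but only two supplied values.
placeholdersMatch('UPDATE assets SET owner = $1 WHERE contract = $2 AND owner = $3',
  ['newowner', 'simpleassets']); // false -> the query would fail at runtime
```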

This is my readers.json:

[
  {
    "name": "simple-1",

    "start_block":  12324528,
    "stop_block": 0,
    "irreversible_only": false,

    "ship_prefetch_blocks": 50,
    "ship_min_block_confirmation": 30,
    "ship_ds_queue_size": 20,

    "db_group_blocks": 10,

    "ds_ship_threads": 4,

    "modules": [],

    "contracts": [
      {
        "handler": "simpleassets",
        "args": {
          "simpleassets_account": "simpleassets",
          "store_transfers": true,
          "store_logs": true
        }
      },
      {
        "handler": "delphioracle",
        "args": {
          "delphioracle_account": "delphioracle"
        }
      }
    ]
  }
]

Steps to reproduce:

  1. run yarn start:filler with the above readers.json and see the errors happen

Is the Filler compatible with a State History Plugin loaded from a snapshot?

Hi! From the README.md I can see the following requirements:

  • State History Plugin enabled with options trace-history = true, chain-state-history = true
  • Fully synced for the block range you want to process

I have such a node setup, but it was loaded from a snapshot via the state history plugin, as specified here.

When I run the filler (with 20 GB of RAM allocated to it), it attempts to pull down the first block.
From my debugging, it then appears that the state history plugin tries to send the entire blockchain up to that first point in a single message, and this message is truncated at around ~2^30 bytes.

I'm wondering if this is an unsupported edge case, or perhaps some missing configuration.
If it is unsupported, I'll gladly submit a PR to specify that in the README.md.
