datahighway-dhx / node

DataHighway Node. A blockchain being built with Substrate to become a parachain on the Polkadot network. Planned features include a decentralized LPWAN roaming hub for LoRaWAN IoT devices and network operator roaming agreements, participative mining, an inter-chain data market, and DAO governance. http://www.datahighway.com

Home Page: http://www.datahighway.com

License: GNU General Public License v3.0

Languages: Rust 99.27%, Shell 0.63%, Dockerfile 0.09%
Topics: datahighway, mxc, iot, lpwan, lorawan, blockchain, parachain, substrate, polkadot, roaming

node's People

Contributors

ayushmishra2005, cgroeschel, festelo, jeffstahlnecker, ltfschoen, sbcodes


node's Issues

Update docs after launch mainnet before release

  • README: "WARNING: This implementation is a proof-of-concept prototype and is not ready for production use."
  • Create release/tag
  • Add instructions for other validators to join Mainnet and how to add session keys etc.
  • Update website docs
  • Ask someone unfamiliar with the process to try it, and update the docs to make it simple

Mining testnet

Mining testnet to have:

  • Distribute 30% of the total supply, fully diluted and fully unlocked, split into multiple accounts of the DAO Treasury Unlocked Reserves (ready to be claimed manually as mining speed boosts for token mining/signalling by the public through the DAO); see the sketch after this list
  • Published on Polkascan with ability to view that 30% of total supply is held by the DAO
  • Remaining 70% of the total supply to be available as staking rewards (for running a collator node, or nominating)
  • Ability to lock testnet MXC and IOTA tokens
  • The GUI shall be Polkadot.js Apps
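For reference, the 30% DAO allocation in the first bullet could be endowed at genesis along these lines. A minimal sketch only: the account type, total supply, decimals, and function name are assumptions, not the repo's chain spec.

type AccountId = [u8; 32];

// Assumed figures: 100m DHX total supply with 18 decimals.
const TOTAL_SUPPLY: u128 = 100_000_000 * 1_000_000_000_000_000_000;

// Split 30% of total supply evenly across the DAO Treasury Unlocked
// Reserves accounts; the remaining 70% backs staking rewards.
fn dao_treasury_endowments(dao_accounts: &[AccountId]) -> Vec<(AccountId, u128)> {
    let dao_total = TOTAL_SUPPLY / 100 * 30;
    let per_account = dao_total / dao_accounts.len() as u128;
    dao_accounts.iter().map(|a| (*a, per_account)).collect()
}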

Update to BABE and PoS

I've tried updating to PoS using BABE in this PR https://github.com/DataHighway-DHX/node/compare/luke/2/update-latest-commit-staking?expand=1. I did it the same way laminar.one did in their chain https://github.com/laminar-protocol/laminar-chain

It allows you to run two local nodes on the 'local' testnet and produce blocks. Using 'testnet-latest' instead gives an error that a balance isn't high enough for bonding, Thread 'main' panicked at 'Stash does not have enough balance to bond.', <::std::macros::panic macros>:2, when you try to output the chain definition.

Produce the error by running the following (i.e. with testnet-latest instead of local):

./target/release/datahighway build-spec \
  --chain=testnet-latest > ./src/chain-spec-templates/chain_spec_testnet_poa_latest.json

./target/release/datahighway build-spec \
  --chain ./src/chain-spec-templates/chain_spec_testnet_poa_latest.json \
  --raw > ./src/chain-definition-custom/chain_def_testnet_poa_v0.1.0.json

Error:

2020-02-27 22:35:33 Building chain spec

====================

Version: 0.0.1-83f8579-x86_64-macos

   0: backtrace::backtrace::trace
   1: backtrace::capture::Backtrace::new
   2: sp_panic_handler::set::{{closure}}
   3: std::panicking::rust_panic_with_hook
   4: std::panicking::begin_panic
   5: std::thread::local::LocalKey<T>::with
   6: sp_state_machine::basic::BasicExternalities::execute_with_storage
   7: <datahighway_runtime::GenesisConfig as sp_runtime::BuildStorage>::assimilate_storage
   8: sp_runtime::BuildStorage::build_storage
   9: sc_chain_spec::chain_spec::ChainSpec<G,E>::to_json
  10: datahighway::command::run
  11: datahighway::main
  12: std::rt::lang_start::{{closure}}
  13: std::panicking::try::do_call
  14: __rust_maybe_catch_panic
  15: std::rt::lang_start_internal
  16: main


Thread 'main' panicked at 'Stash does not have enough balance to bond.', <::std::macros::panic macros>:2
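The panic comes from the staking genesis build, which checks that each stash's endowed balance covers the amount being bonded. A minimal sketch of the invariant (the amounts are illustrative, not the testnet-latest values):

// For every (stash, controller, bond, status) entry in the staking
// genesis `stakers` list, the balances genesis must endow the stash
// with at least `bond`, otherwise build-spec panics as above.
const ENDOWMENT: u128 = 1_000_000_000_000_000_000; // balances genesis per stash
const BOND: u128 = 100_000_000_000_000_000;        // staking genesis bond per stash

fn stash_can_bond() -> bool {
    ENDOWMENT >= BOND
}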

And when you try to use polkadot.js.org/apps, the following breaks in the Staking tab (due to api.derive.imOnline.receivedHeartbeats in the Polkadot.js Apps codebase):

TypeError: Cannot read property 'receivedHeartbeats' of undefined

And other tabs show error DRR: TypeError: Cannot read property 'vesting' of undefined

I enquired with laminar.one and they encountered the same errors and decided to build their own front-end https://github.com/laminar-protocol/laminar-chain-apps

Since we're creating an MVP, we won't update to PoS until BABE has stabilized and the above errors are no longer encountered. We'll just use polkadot.js.org/apps with Aura. Note: Edgeware still uses Aura.

Setup Parachain

  • Parathread auction mechanism is in the Polkadot repo.
  • There is no difference between a parachain and a parathread (so far).
  • Each block, a parachain will be guaranteed a slot to put their latest block head, whereas a parathread is not guaranteed that spot, but can auction for it. Once a parathread or parachain gets their slot for that block, they are indistinguishable from the code perspective
  • Notify the relay chain whether your chain is a parachain or a parathread by the way you sign up. To be a parachain, you go through the parachain candle auction process and you register your parachain. As a parathread, you place a one time deposit to be part of the "parathread pool" and then you can do bids each block.
    You can also swap between the two statuses, assuming you find someone of the opposite status to swap with
  • Videos

update chain specs

Whilst it's OK to use the well-known Alice/Bob/Charlie keys for the dev and local testnets, we should use custom session keys for testnet_latest, which we also use in Harbour. Key generation and management should be tested in the testnet_latest network first.

Unable to finalize blocks or change session keys

Issue definition

After running both an Alice node and a Bob node, blocks are not finalized, i.e.

Idle (0 peers), best: #112 (0x4d42…cbd1), finalized #0 (0x3947…2716), ⬇ 0 ⬆ 0

I've tried changing the session keys as follows using insertKey for each key type (i.e. aura, babe, imonline, grandpa), but the output from each cURL request is {"jsonrpc":"2.0","result":null,"id":1}, which means it did not work.

I've been advised by @xlc that when changing validators/session keys, the change only comes into effect after the block that includes the change has been finalized, so only change one at a time. Also, wait for two eras (optionally use sudo staking force era to speed it up) and make sure everything is still finalizing before making the second change. After you have everything running smoothly, you can use rotateKeys instead of insertKey.

Reproduce the issue

  • Build the chain spec and raw chain definition used to run the node based on the latest code
./target/release/datahighway build-spec \
  --chain=testnet-latest > ./src/chain-spec-templates/chain_spec_testnet_latest.json

./target/release/datahighway build-spec \
  --chain ./src/chain-spec-templates/chain_spec_testnet_latest.json \
  --raw > ./src/chain-definition-custom/chain_def_testnet_v0.1.0.json
  • Install latest Substrate and Subkey binary.
curl https://getsubstrate.io -sSf | bash && \
./scripts/init.sh
  • Remove previous chains
./target/release/datahighway purge-chain --dev --base-path /tmp/polkadot-chains/alice
./target/release/datahighway purge-chain --dev --base-path /tmp/polkadot-chains/bob
  • Run a node for both Alice (bootnode) and Bob as follows:
SKIP_WASM_BUILD= ./target/release/datahighway --validator \
  --unsafe-ws-external \
  --unsafe-rpc-external \
  --rpc-cors=all \
  --base-path /tmp/polkadot-chains/alice \
  --keystore-path "/tmp/polkadot-chains/alice/keys" \
  --chain ./src/chain-definition-custom/chain_def_testnet_v0.1.0.json \
  --node-key 88dc3417d5058ec4b4503e0c12ea1a0a89be200fe98922423d4334014fa6b0ee \
  --alice \
  --rpc-port 9933 \
  --port 30333 \
  --telemetry-url ws://telemetry.polkadot.io:1024 \
  --ws-port 9944 \
  --execution=native \
  -lruntime=debug
SKIP_WASM_BUILD= ./target/release/datahighway --validator \
  --unsafe-ws-external \
  --unsafe-rpc-external \
  --rpc-cors=all \
  --base-path /tmp/polkadot-chains/bob \
  --keystore-path "/tmp/polkadot-chains/bob/keys" \
  --bootnodes /ip4/127.0.0.1/tcp/30333/p2p/QmWYmZrHFPkgX8PgMgUpHJsK6Q6vWbeVXrKhciunJdRvKZ \
  --chain ./src/chain-definition-custom/chain_def_testnet_v0.1.0.json \
  --bob \
  --rpc-port 9933 \
  --port 30334 \
  --telemetry-url ws://telemetry.polkadot.io:1024 \
  --ws-port 9944 \
  --execution=native \
  -lruntime=debug
  • Generate session keys for Alice
$ subkey --ed25519 inspect "//Alice"
Secret Key URI `//Alice` is account:
  Secret seed:      0xabf8e5bdbe30c65656c0a3cbd181ff8a56294a69dfedd27982aace4a76909115
  Public key (hex): 0x88dc3417d5058ec4b4503e0c12ea1a0a89be200fe98922423d4334014fa6b0ee
  Account ID:       0x88dc3417d5058ec4b4503e0c12ea1a0a89be200fe98922423d4334014fa6b0ee
  SS58 Address:     5FA9nQDVg267DEd8m1ZypXLBnvN7SFxYwV7ndqSYGiN9TTpu

$ subkey --sr25519 inspect "//Alice"//aura
Secret Key URI `//Alice//aura` is account:
  Secret seed:      0x153d8db5f7ef35f18a456c049d6f6e2c723d6c18d1f9f6c9fbee880c2a171c73
  Public key (hex): 0x408f99b525d90cce76288245cb975771282c2cefa89d693b9da2cdbed6cd9152
  Account ID:       0x408f99b525d90cce76288245cb975771282c2cefa89d693b9da2cdbed6cd9152
  SS58 Address:     5DXMabRsSpaMwfNivWjWEnzYtiHsKwQnP4aAKB85429ZQU6v

$ subkey --sr25519 inspect "//Alice"//babe
Secret Key URI `//Alice//babe` is account:
  Secret seed:      0x7bc0e13f128f3f3274e407de23057efe043c2e12d8ed72dc5c627975755c9620
  Public key (hex): 0x46ffa3a808850b2ad55732e958e781146ed1e6436ffb83290e0cb810aacf5070
  Account ID:       0x46ffa3a808850b2ad55732e958e781146ed1e6436ffb83290e0cb810aacf5070
  SS58 Address:     5Dfo9eF9C7Lu5Vbc8LbaMXi1Us2yi5VGTTA7radKoxb7M9HT

$ subkey --sr25519 inspect "//Alice"//imonline
Secret Key URI `//Alice//imonline` is account:
  Secret seed:      0xf54dc00d41d0ea7929ac00a08ed1e111eb8c35d669b011c649cea23997f5d8d9
  Public key (hex): 0xee725cf87fa2d6f264f26d7d8b84b1054d2182cdcce51fdea95ec868be9d1e17
  Account ID:       0xee725cf87fa2d6f264f26d7d8b84b1054d2182cdcce51fdea95ec868be9d1e17
  SS58 Address:     5HTME6o2DqEuoNCxE5263j2dNzFGxspeP8wswenPA3WerfmA

$ subkey --ed25519 inspect "//Alice"//grandpa
Secret Key URI `//Alice//grandpa` is account:
  Secret seed:      0x03bee0237d4847732404fde7539e356da44bce9cd69f26f869883419371a78ab
  Public key (hex): 0x6e2de2e5087b56ed2370359574f479d7e5da1973e17ca1b55882c4773f154d2f
  Account ID:       0x6e2de2e5087b56ed2370359574f479d7e5da1973e17ca1b55882c4773f154d2f
  SS58 Address:     5EZAkmxARDqRz5z5ojuTjacTs2rTd7WRL1A9ZeLvwgq2STA2
  • Run cURL to insert the session key for each key type (i.e. "aura"), providing the associated secret key and the associated Public key (hex). Note that I've also tried using the Secret seed (i.e. 0x153d8db5f7ef35f18a456c049d6f6e2c723d6c18d1f9f6c9fbee880c2a171c73) instead of the Secret Key URI (i.e. //Alice//aura), but that does not work either
curl -vH 'Content-Type: application/json' --data '{ "jsonrpc":"2.0", "method":"author_insertKey", "params":["aura", "0x153d8db5f7ef35f18a456c049d6f6e2c723d6c18d1f9f6c9fbee880c2a171c73", "0x408f99b525d90cce76288245cb975771282c2cefa89d693b9da2cdbed6cd9152"],"id":1 }' localhost:9933
curl -vH 'Content-Type: application/json' --data '{ "jsonrpc":"2.0", "method":"author_insertKey", "params":["babe", "//Alice//babe", "0x46ffa3a808850b2ad55732e958e781146ed1e6436ffb83290e0cb810aacf5070"],"id":1 }' localhost:9933
curl -vH 'Content-Type: application/json' --data '{ "jsonrpc":"2.0", "method":"author_insertKey", "params":["imon", "//Alice//imonline", "0xee725cf87fa2d6f264f26d7d8b84b1054d2182cdcce51fdea95ec868be9d1e17"],"id":1 }' localhost:9933
curl -vH 'Content-Type: application/json' --data '{ "jsonrpc":"2.0", "method":"author_insertKey", "params":["gran", "//Alice//grandpa", "0x6e2de2e5087b56ed2370359574f479d7e5da1973e17ca1b55882c4773f154d2f"],"id":1 }' localhost:9933
  • Problems encountered
    • Output from each cURL request is {"jsonrpc":"2.0","result":null,"id":1}, when it should be: {"jsonrpc":"2.0","result":"0x...","id":1}
      • Example output:
$ curl -vH 'Content-Type: application/json' --data '{ "jsonrpc":"2.0", "method":"author_insertKey", "params":["aura", "0x153d8db5f7ef35f18a456c049d6f6e2c723d6c18d1f9f6c9fbee880c2a171c73", "0x408f99b525d90cce76288245cb975771282c2cefa89d693b9da2cdbed6cd9152"],"id":1 }' localhost:9933
* Rebuilt URL to: localhost:9933/
*   Trying ::1...
* TCP_NODELAY set
* Connection failed
* connect to ::1 port 9933 failed: Connection refused
*   Trying 127.0.0.1...
* TCP_NODELAY set
* Connected to localhost (127.0.0.1) port 9933 (#0)
> POST / HTTP/1.1
> Host: localhost:9933
> User-Agent: curl/7.54.0
> Accept: */*
> Content-Type: application/json
> Content-Length: 214
> 
* upload completely sent off: 214 out of 214 bytes
< HTTP/1.1 200 OK
< content-type: application/json; charset=utf-8
< content-length: 39
< date: Tue, 17 Mar 2020 03:42:35 GMT
< 
{"jsonrpc":"2.0","result":null,"id":1}
* Connection #0 to host localhost left intact
  • The keystore folder /tmp/polkadot-chains/alice/keys is empty after running cURL

Duplicate lang item errors upon importing custom env module into custom runtime module

This issue isn't strictly necessary to resolve; I'm just trying to get something to work as part of my integration test in this branch "luke/DHX-114/unable-mock-unique-id".

The test calls the create extrinsic function of the roaming-agreement-policy runtime module in order to create a roaming agreement policy. The extrinsic function assigns a random value to the RoamingAgreementPolicy struct by using the random_value function and returns that struct or an error, as shown here: https://github.com/DataHighway-com/node/compare/luke/DHX-114/unable-mock-unique-id?expand=1#diff-f700a0b0e1a64f5c0bce4eade50947b2R247

In the test, I don't care if it's a random value, I just want to know that it was created, which I could do by checking that the count of roaming agreement policies increased from 0 to 1 (as done here https://github.com/DataHighway-com/node/compare/luke/DHX-114/unable-mock-unique-id?expand=1#diff-f700a0b0e1a64f5c0bce4eade50947b2R257)

But I would also like to know how to conditionally run something different in the extrinsic function (i.e. assign a non-unique value instead of a random value), but only when I'm running tests, which I've tried to do here.

To find out whether the extrinsic function is being called by a test, I'm using my custom env package to check the value of the RUST_ENV variable. Calling env::config::get_env() returns "TEST" when the function is called under cargo test.
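For comparison, a compile-time alternative to reading RUST_ENV is a cargo feature (the deterministic-id name below is hypothetical). It avoids linking an std-dependent crate into the no_std runtime, which is the root of the duplicate lang item errors shown below:

// Hypothetical `deterministic-id` cargo feature: tests run with
// `cargo test --features deterministic-id` to get a fixed value,
// and no std-only `env` crate is linked into the runtime build.
#[cfg(feature = "deterministic-id")]
fn random_value_seed() -> [u8; 16] {
    [0u8; 16] // fixed, so tests can assert on the created policy
}

#[cfg(not(feature = "deterministic-id"))]
fn random_value_seed() -> [u8; 16] {
    [1u8; 16] // stand-in for the module's real random_value source
}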

But when I try to run the tests with the changes in this branch, I get the following error:

error: duplicate lang item in crate `std` (which `env` depends on): `panic_impl`.
  |
  = note: first defined in crate `sr_io` (which `roaming_agreement_policies` depends on)

error: duplicate lang item in crate `std` (which `env` depends on): `oom`.
  |
  = note: first defined in crate `sr_io` (which `roaming_agreement_policies` depends on)

error: aborting due to 2 previous errors

error: could not compile `roaming-agreement-policies`.

Someone else encountered a similar error message on their Substrate-based codebase here paritytech/substrate#2971, but I'm not sure if it's related.

To replicate the issue, clone this branch, update Rust, then build and run the integration tests as follows:

git clone https://github.com/DataHighway-com/node && \
cd node && \
curl https://getsubstrate.io -sSf | bash -s -- --fast && \
./scripts/init.sh && \
cargo build --release && \
cargo test -p node-runtime

Fix test for is_premine and is_supernode_claim_reasonable

We've added functionality that limits withdrawals to 5000 DHX per day when IsPremine in storage is set to false, and it works in the browser.

But I can't get the associated test to pass, as it won't accept the u128 value for a u64 argument in the tests, producing this error:

(Screenshot from 2021-04-07: compiler error rejecting the u128 value where a u64 argument is expected)

Here's the test code that I've had to comment out, which triggers this error:

            // 29th March 2021 @ ~2am is 1616968800000u64
            // https://currentmillis.com/
            Timestamp::set_timestamp(1616968800000u64);

            assert_ok!(MiningEligibilityProxyTestModule::set_is_premine(
                Origin::signed(0),
                false,
            ));

            let rewardee_data_high = MiningEligibilityProxyClaimRewardeeData {
                proxy_claim_rewardee_account_id: 3,
                proxy_claim_reward_amount: 5001000000000000000000,
                proxy_claim_start_date: NaiveDate::from_ymd(2000, 1, 1).and_hms(0, 0, 0).timestamp() * 1000,
                proxy_claim_end_date: NaiveDate::from_ymd(2000, 1, 9).and_hms(0, 0, 0).timestamp() * 1000,
            };
            let mut proxy_claim_rewardees_data_high: Vec<MiningEligibilityProxyClaimRewardeeData<u64, u64, i64, i64>> =
                Vec::new();
            proxy_claim_rewardees_data_high.push(rewardee_data_high);

            assert_err!(
                MiningEligibilityProxyTestModule::proxy_eligibility_claim(
                    Origin::signed(2),
                    5001000000000000000000, // _proxy_claim_total_reward_amount
                    proxy_claim_rewardees_data_high.clone(),
                ),
                "Supernode claim has been deemed unreasonable",
            );
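For context, the rejected literal genuinely doesn't fit the mock's u64 Balance; a quick check:

// 5001 DHX at 18 decimals needs 73 bits, so it can never be a u64:
// either the mock's Balance type must widen to u128, or the test
// amounts must shrink to fit within 64 bits.
fn main() {
    let amount: u128 = 5_001_000_000_000_000_000_000;
    assert!(amount > u64::MAX as u128); // u64::MAX is only ~1.8e19
    println!("amount needs {} bits", 128 - amount.leading_zeros());
}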

Unable to use floats to check rates are below threshold

In file DataHighway-com/node/packages/mining/mining-speed-boosts/rates/token-mining/src/lib.rs: how do we use floats to check these values aren't too large, without getting the error

the trait std::str::FromStr is not implemented for <T as Trait>::MiningSpeedBoostRatesTokenMiningMaxToken

This will require changing relevant types from u64 to f64 and u32 to f32 and getting it to compile.


            // if token_token_mxc > "1.2".parse().unwrap() || token_token_iota > "1.2".parse().unwrap() || token_token_dot > "1.2".parse().unwrap() || token_max_token > "1.6".parse().unwrap() || token_max_loyalty > "1.2".parse().unwrap() {
            //   debug::info!("Token rate cannot be this large");
              
            //   return Ok(());
            // }
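A hedged alternative to floats (which runtimes generally avoid for determinism across Wasm and native): store rates as integers at a fixed scale, so 1.2 becomes 1200 at scale 1000. Illustrative names and constants only, not the module's types:

// Rates held at a fixed scale of 1000: 1.2 -> 1200, 1.6 -> 1600.
const MAX_PAIR_RATE: u64 = 1200;  // cap for token_token_mxc / iota / dot and max_loyalty
const MAX_TOKEN_RATE: u64 = 1600; // cap for token_max_token

fn rates_too_large(mxc: u64, iota: u64, dot: u64, max_token: u64, max_loyalty: u64) -> bool {
    mxc > MAX_PAIR_RATE
        || iota > MAX_PAIR_RATE
        || dot > MAX_PAIR_RATE
        || max_token > MAX_TOKEN_RATE
        || max_loyalty > MAX_PAIR_RATE
}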

Unit tests

We should have comprehensive unit tests (i.e. for example, here are the unit tests in the "roaming networks" runtime module: https://github.com/DataHighway-com/node/blob/master/packages/roaming/roaming-networks/src/lib.rs#L252) in each runtime module. These are currently not fully complete.

Currently we have an integration test that covers most of the user flows across multiple runtime modules (i.e. for example, here is the proposed flow of setting up roaming https://github.com/DataHighway-com/node/blob/master/runtime/tests/cli_integration_tests.rs#L264).

We should wait until the architecture of the runtime modules is finalised, then go through the integration test and find relevant aspects that should also be covered in the individual runtime modules.

Unclear what type to assign to AccountStore in balances Trait of a runtime module

In our blockchain's runtime/src/lib.rs, I have the following (which is consistent with what's shown in https://github.com/paritytech/substrate/blob/master/bin/node-template/runtime/src/lib.rs#L219)

impl balances::Trait for Runtime {
    type AccountStore = System;
    ...

then in a specific runtime module where I want to use balances, I have

impl balances::Trait for Test {
    type AccountStore = ();
    ...
}
...

type System = system::Module<Test>;
pub type Balances = balances::Module<Test>;

But when I run the tests for that runtime module with cargo test -p roaming-accounting-policies, I get error:

error[E0277]: the trait bound `(): sp_api_hidden_includes_decl_storage::hidden_include::traits::StoredMap<u64, pallet_balances::AccountData<u64>>` is not satisfied
  --> pallets/roaming/roaming-accounting-policies/src/mock.rs:55:2
   |
54 | impl balances::Trait for Test {
   | ----------------------------- in this `impl` item
55 |     type AccountStore = ();
   |     ^^^^^^^^^^^^^^^^^^^^^^^ the trait `sp_api_hidden_includes_decl_storage::hidden_include::traits::StoredMap<u64, pallet_balances::AccountData<u64>>` is not implemented for `()`

Unfortunately the mock of the node-template's "template" runtime module https://github.com/paritytech/substrate/blob/master/bin/node-template/pallets/template/src/mock.rs doesn't include an impl balances::Trait for Test { block to show what type should be used.
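For reference, one way to satisfy the bound in the mock, hedged since the full mock isn't shown, is the same wiring the runtime uses: point AccountStore at the system pallet, which implements the StoredMap trait the compiler is asking for.

// In mock.rs (sketch): System = system::Module<Test> implements
// StoredMap<AccountId, AccountData<Balance>> when account data lives
// in the system pallet, unlike `()`.
impl balances::Trait for Test {
    type AccountStore = System;
    // ... remaining associated types as before
}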

Update to support multiple tokens

The generic-asset frame https://github.com/paritytech/substrate/tree/master/frame/generic-asset, discussed here https://www.youtube.com/watch?v=7qkqEfToH8w&list=PLp0_ueXY_enXRfoaW7sTudeQH10yDvFOS&index=4&t=0s, is (or can be) a replacement for the balances pallet.

Laminar.one's orml-tokens and orml-currencies pallets take a different approach that works alongside the balances pallet, so there are fewer incompatibilities with Polkadot.js Apps.

Since we can just use the balances pallet for the DHX token, we will use that for MVP until other approaches stabilize.

Mock RewardDailyData in integration tests instead of relying on generic RewardDailyData from implementation

Instead of using RewardDailyData from the implementation, consider creating a mock of it instead and decorating it with Debug and so forth, like in the implementation. It doesn't cause any errors at the moment because RewardDailyData only uses generics in the implementation, but if it were defined with specific types then it would generate errors. A sketch follows the link below.
https://github.com/DataHighway-DHX/node/pull/145/files#diff-a733bba89759ce77a887593f0ff42d78d72d53478a436c73a23623f1f6e34740R854
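A minimal sketch of the suggested test-side mock; the field names and types are assumptions based on the linked diff, not the actual struct:

// Test-side mirror of RewardDailyData with concrete types: if the
// implementation later pins its generics to different concrete types,
// comparisons against this mock fail at compile time, as intended.
#[derive(Clone, Debug, PartialEq, Eq)]
pub struct MockRewardDailyData {
    pub total_amount: u128,    // assumed field
    pub rewarded_account: u64, // assumed field
    pub day: u32,              // assumed field
}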

Genesis allocation

Reserve 30% of tokens from the total supply for the DAO Treasury Unlocked Reserves. The remaining 70% are used for the block reward. See the whitepaper for details.

Style

Indentation should be made consistent

Failing CI tests

When the CI test build runs cargo test here https://github.com/DataHighway-DHX/node/runs/459859012, it generates the following error:

   Compiling js-sys v0.3.35
error[E0603]: struct `Memory` is private
    --> /home/runner/.cargo/registry/src/github.com-1ecc6299db9ec823/js-sys-0.3.35/src/lib.rs:4873:60
     |
4873 |                 let mem = buf.unchecked_ref::<WebAssembly::Memory>();
     |                                                            ^^^^^^ this struct is private
     |
note: the struct `Memory` is defined here
    --> /home/runner/.cargo/registry/src/github.com-1ecc6299db9ec823/js-sys-0.3.35/src/lib.rs:3516:5
     |
3516 |     #[wasm_bindgen]
     |     ^^^^^^^^^^^^^^^
     = note: this error originates in an attribute macro (in Nightly builds, run with -Z macro-backtrace for more info)

block production halts at ~#159 using any network other than --chain dev

Since we upgraded to Substrate 2.0.0, block production and finalization halt somewhere around block #159.

Node startup (using the local network in Docker)


Up until this point everything runs smoothly: blocks get built and finalized (BABE/GRANDPA) OK, and then suddenly no new ones get produced.
There is no error, and all peers are still connected.
It looks like the nodes no longer reach consensus from around #159.


This also happens when using the local network (Alice/Bob, ...); --chain dev does not have the issue.

Restarting the nodes doesn't help: they peer up OK but are stuck at block building from there onwards.


To reproduce the issue using docker-compose

  • clone this repo
    git clone https://github.com/DataHighway-DHX/node.git && cd node
  • copy env file
    cp .env.sample .env
  • launch the local testnet and wait until around block #150, after which it will no longer build new blocks
    docker-compose up

Change block reward issuance per day from 2400 to 5000

Reward Distribution

DHX reward tokens would be distributed to authorized users every block using the on_finalize method (example: https://github.com/ZainMustafaaa/substrate-aura/blob/master/runtime/src/reward.rs); a sketch follows the notes below.

Note:

  • type AccountId should be used for user accounts
  • type BalanceOf<T> should be used for rewards; only use u128 for the balance in the runtime config if you pass a configuration from the Config (e.g. let reward = BalanceOf::<T>::from(100))
  • assume within a pallet the balance is at least a u32 (not that it is u128)
  • try with reward amount of 100 initially to check it works, using 21 zeros for the average balance, and check that Alice's balance increases by x tokens per block. But we want the Block Reward to be 0.25 (see below)
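A minimal sketch of the distribution step described above. The Currency::deposit_creating call is the standard frame_support API; the authorized_accounts getter is hypothetical, and the 100 comes from the trial amount in the last note:

// Sketch: deposit a fixed trial reward each block in on_finalize,
// using BalanceOf<T> rather than a hard-coded u128 as advised above.
fn on_finalize(_block: T::BlockNumber) {
    let reward = BalanceOf::<T>::from(100u32); // trial amount
    for account in Self::authorized_accounts() { // hypothetical storage getter
        let _ = T::Currency::deposit_creating(&account, reward);
    }
}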

Block Time vs Block Reward Calculation

In the whitepaper, we've proposed:

  • Block Reward Issuance (of DHX) Daily of 2,400, based on:
  • Block Reward of 0.25, and Block Time of 9 seconds

However, we want to change 2,400 to 5,000, so we need to change it. Referring to the existing spreadsheet calculation, the working is as follows:

block_reward_issuance_daily  = block_reward * block_production_rate_min * 60 * 24
block_production_rate_min = 60 / block_time

where we want:

block_reward = 0.25
block_reward_issuance_daily = 5000

so substituting block_production_rate_min we get:

5000  = 0.25 * (60 / block_time) * 60 * 24

so rearranging:

block_time = 60 * 60 * 24 * 0.25 / 5000 = 4.32

....

So assuming we want a block reward of 0.25 initially, we need to update the runtime code to use a block_time of 4.32 seconds, where it's defined as MILLISECS_PER_BLOCK, with a value of 4320 (milliseconds). There's currently a duplicate of it in these locations:

and make sure it's not defined in duplicate here https://github.com/DataHighway-DHX/node/blob/master/runtime/src/lib.rs#L126
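The corresponding constant change would look like the following sketch (the SLOT_DURATION pairing is the usual node-template pattern):

// 5,000 DHX/day at 0.25 DHX per block => 20,000 blocks/day,
// i.e. 86_400_000 ms / 20_000 blocks = 4_320 ms per block.
pub const MILLISECS_PER_BLOCK: u64 = 4320;
pub const SLOT_DURATION: u64 = MILLISECS_PER_BLOCK;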

Custom Substrate SS58 Address Format for DataHighway network

Implement calculation of eligibility

Uncomment calculate_mining_speed_boost_eligibility_token_mining_result and fix the type errors. Uncomment the call to the same function in DataHighway-com/node/runtime/tests/cli_integration_tests_token_mining.rs and get that integration test to work.

Currently, when uncommented, the errors are like the following:

166 |                 if token_type != "".to_string() {
    |                                  ^^^^^^^^^^^^^^ expected associated type, found struct `std::string::String`

170 |                     if token_locked_amount != 0 {
    |                                               ^ expected associated type, found integer

error[E0277]: the trait bound `&std::vec::Vec<<T as mining_speed_boosts_sampling_token_mining::Trait>::MiningSpeedBoostSamplingTokenMiningIndex>: _::_parity_scale_codec::EncodeLike<<T as mining_speed_boosts_sampling_token_mining::Trait>::MiningSpeedBoostSamplingTokenMiningIndex>` is not satisfied
   --> packages/mining/mining-speed-boosts/eligibility/token-mining/src/lib.rs:188:27
    |
188 |                           (mining_speed_boost_configuration_token_mining_id, sampling_token_mining_id)
    |                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ the trait `_::_parity_scale_codec::EncodeLike<<T as mining_speed_boosts_sampling_token_mining::Trait>::MiningSpeedBoostSamplingTokenMiningIndex>` is not implemented for `&std::vec::Vec<<T as mining_speed_boosts_sampling_token_mining::Trait>::MiningSpeedBoostSamplingTokenMiningIndex>`

error[E0277]: the trait bound `{integer}: _::_parity_scale_codec::EncodeLike<<T as mining_speed_boosts_rates_token_mining::Trait>::MiningSpeedBoostRatesTokenMiningIndex>` is not satisfied
   --> packages/mining/mining-speed-boosts/eligibility/token-mining/src/lib.rs:199:175
    |
199 | ...                   if let Some(token_mining_rates_config) = <mining_speed_boosts_rates_token_mining::Module<T>>::mining_speed_boost_rates_token_mining_rates_configs(DEFAULT_RATE_CONFIG) {
    |                                                                                                                                                                         ^^^^^^^^^^^^^^^^^^^ the trait `_::_parity_scale_codec::EncodeLike<<T as mining_speed_boosts_rates_token_mining::Trait>::MiningSpeedBoostRatesTokenMiningIndex>` is not implemented for `{integer}`

error[E0308]: mismatched types
   --> packages/mining/mining-speed-boosts/eligibility/token-mining/src/lib.rs:201:56
    |
201 | ...                   if current_token_type == "MXC".to_string() {
    |                                                ^^^^^^^^^^^^^^^^^ expected associated type, found struct `std::string::String`
    |
    = note: expected associated type `<T as mining_speed_boosts_configuration_token_mining::Trait>::MiningSpeedBoostConfigurationTokenMiningTokenType`
                        found struct `std::string::String`
    = note: consider constraining the associated type `<T as mining_speed_boosts_configuration_token_mining::Trait>::MiningSpeedBoostConfigurationTokenMiningTokenType` to `std::string::String` or calling a method that returns `<T as mining_speed_boosts_configuration_token_mining::Trait>::MiningSpeedBoostConfigurationTokenMiningTokenType`
    = note: for more information, visit https://doc.rust-lang.org/book/ch19-03-advanced-traits.html
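The E0308 errors say the pallet compares an opaque associated type against concrete strings and integers. One hedged way out, sketched below, is to constrain the associated type in the trait so values can be built from raw bytes and compared (the bounds are assumptions; the names come from the errors above):

use sp_std::vec::Vec;

pub trait Trait: system::Trait {
    // From<Vec<u8>> lets the pallet construct e.g. b"MXC".to_vec().into()
    // for comparison, and PartialEq enables the comparison itself.
    type MiningSpeedBoostConfigurationTokenMiningTokenType: From<Vec<u8>> + PartialEq;
}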

Unable to implement Debug for roaming agreement policies due to conflict

When Debug is used in DataHighway-com/node/packages/roaming/roaming-agreement-policies/src/lib.rs, as follows

#[cfg_attr(feature = "std", derive(Debug))]
#[derive(Encode, Debug, Decode, Default, Clone, PartialEq)]
// Generic type parameters - Balance
pub struct RoamingAgreementPolicyConfig<U, V> {
    pub policy_activation_type: U, // "passive" or "handover"
    pub policy_expiry: V,
}

It gives the following error:

error[E0119]: conflicting implementations of trait `std::fmt::Debug` for type `RoamingAgreementPolicyConfig<_, _>`:
  --> packages/roaming/roaming-agreement-policies/src/lib.rs:34:18
   |
34 | #[derive(Encode, Debug, Decode, Default, Clone, PartialEq)]
   |                  ^^^^^ conflicting implementation for `RoamingAgreementPolicyConfig<_, _>`
35 | #[cfg_attr(feature = "std", derive(Debug))]
   |                                    ----- first implementation here


But if I remove Debug so it's just #[derive(Encode, Decode, Default, Clone, PartialEq)] then I get errors when trying to run integration tests, so I've had to comment out that integration test:

error[E0277]: `agreement_policies::RoamingAgreementPolicyConfig<std::vec::Vec<u8>, u64>` doesn't implement `std::fmt::Debug`
   --> runtime/tests/cli_integration_tests_roaming.rs:396:13
    |
396 | /             assert_eq!(
397 | |                 RoamingAgreementPolicyTestModule::roaming_agreement_policy_configs(0),
398 | |                 Some(RoamingAgreementPolicyConfig {
399 | |                     policy_activation_type: "passive".as_bytes().to_vec(),
400 | |                     policy_expiry: 2019,
401 | |                 })
402 | |             );
    | |______________^ `agreement_policies::RoamingAgreementPolicyConfig<std::vec::Vec<u8>, u64>` cannot be formatted using `{:?}` because it doesn't implement `std::fmt::Debug`
    |
    = help: the trait `std::fmt::Debug` is not implemented for `agreement_policies::RoamingAgreementPolicyConfig<std::vec::Vec<u8>, u64>`
    = note: required because of the requirements on the impl of `std::fmt::Debug` for `std::option::Option<agreement_policies::RoamingAgreementPolicyConfig<std::vec::Vec<u8>, u64>>`
    = note: required because of the requirements on the impl of `std::fmt::Debug` for `&std::option::Option<agreement_policies::RoamingAgreementPolicyConfig<std::vec::Vec<u8>, u64>>`
    = note: required by `std::fmt::Debug::fmt`
    = note: this error originates in a macro outside of the current crate (in Nightly builds, run with -Z external-macro-backtrace for more info)

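Since both derives expand to the same impl when the std feature is enabled, keeping exactly one of them should clear the E0119 conflict while still giving the integration test the Debug impl it needs. A hedged sketch:

// Derive Debug exactly once; core::fmt::Debug also exists under no_std,
// so the unconditional derive satisfies both build configurations and
// the assert_eq! in the integration test.
#[derive(Encode, Decode, Debug, Default, Clone, PartialEq)]
pub struct RoamingAgreementPolicyConfig<U, V> {
    pub policy_activation_type: U, // "passive" or "handover"
    pub policy_expiry: V,
}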
