nimiq / core-rs
Official Rust implementation of the Nimiq protocol
Home Page: https://nimiq.com
License: Other
README
Expected behavior: the build doesn't fail.
Actual behavior: the build fails.
cargo +nightly build --release
error: failed to run custom build command for `libargon2-sys v0.2.0 (/home/ubuntu/nimiq/libargon2-sys)`
Caused by:
process didn't exit successfully: `/home/ubuntu/nimiq/target/release/build/libargon2-sys-66ab85cb0894a5ef/build-script-build` (exit code: 1)
--- stdout
TARGET = Some("aarch64-unknown-linux-gnu")
HOST = Some("aarch64-unknown-linux-gnu")
CC_aarch64-unknown-linux-gnu = None
CC_aarch64_unknown_linux_gnu = None
HOST_CC = None
CC = None
CFLAGS_aarch64-unknown-linux-gnu = None
CFLAGS_aarch64_unknown_linux_gnu = None
HOST_CFLAGS = None
CFLAGS = None
CRATE_CC_NO_DEFAULTS = None
DEBUG = Some("false")
CARGO_CFG_TARGET_FEATURE = Some("fp,neon")
running: "cc" "-O2" "-ffunction-sections" "-fdata-sections" "-fPIC" "-I" "native" "-Wall" "-Wextra" "-DARGON2_NO_THREADS" "-o" "/home/ubuntu/nimiq/target/release/build/libargon2-sys-1a40ba79533982af/out/native/argon2.o" "-c" "native/argon2.c"
exit code: 0
running: "cc" "-O2" "-ffunction-sections" "-fdata-sections" "-fPIC" "-I" "native" "-Wall" "-Wextra" "-DARGON2_NO_THREADS" "-o" "/home/ubuntu/nimiq/target/release/build/libargon2-sys-1a40ba79533982af/out/native/core.o" "-c" "native/core.c"
exit code: 0
running: "cc" "-O2" "-ffunction-sections" "-fdata-sections" "-fPIC" "-I" "native" "-Wall" "-Wextra" "-DARGON2_NO_THREADS" "-o" "/home/ubuntu/nimiq/target/release/build/libargon2-sys-1a40ba79533982af/out/native/blake2/blake2b.o" "-c" "native/blake2/blake2b.c"
exit code: 0
running: "cc" "-O2" "-ffunction-sections" "-fdata-sections" "-fPIC" "-I" "native" "-Wall" "-Wextra" "-DARGON2_NO_THREADS" "-o" "/home/ubuntu/nimiq/target/release/build/libargon2-sys-1a40ba79533982af/out/native/opt.o" "-c" "native/opt.c"
cargo:warning=In file included from native/opt.c:26:0:
cargo:warning=native/blake2/blamka-round-opt.h:23:10: fatal error: emmintrin.h: No such file or directory
cargo:warning= #include <emmintrin.h>
cargo:warning= ^~~~~~~~~~~~~
cargo:warning=compilation terminated.
exit code: 1
--- stderr
error occurred: Command "cc" "-O2" "-ffunction-sections" "-fdata-sections" "-fPIC" "-I" "native" "-Wall" "-Wextra" "-DARGON2_NO_THREADS" "-o" "/home/ubuntu/nimiq/target/release/build/libargon2-sys-1a40ba79533982af/out/native/opt.o" "-c" "native/opt.c" with args "cc" did not execute successfully (status code exit code: 1).
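The failure above comes from native/opt.c unconditionally including the x86 SSE2 header <emmintrin.h>, which doesn't exist on aarch64. One way to sidestep this is to gate the optimized source on the target architecture in the build script. A minimal sketch, not the actual libargon2-sys build script (the function name is illustrative; ref.c is the portable reference implementation shipped alongside opt.c in the Argon2 sources):

```rust
// Sketch: choose between the SSE-optimized and the portable Argon2
// sources based on Cargo's target-architecture variable.
fn pick_argon2_source(target_arch: &str) -> &'static str {
    match target_arch {
        // opt.c pulls in <emmintrin.h>, so it only compiles on x86.
        "x86" | "x86_64" => "native/opt.c",
        // ref.c is the portable reference implementation.
        _ => "native/ref.c",
    }
}

fn main() {
    // In a real build.rs, Cargo sets CARGO_CFG_TARGET_ARCH for us.
    let arch = std::env::var("CARGO_CFG_TARGET_ARCH")
        .unwrap_or_else(|_| std::env::consts::ARCH.to_string());
    println!("compiling Argon2 from {}", pick_argon2_source(&arch));
}
```

On aarch64 this would fall back to the reference code and avoid the missing-header error entirely.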
Currently, it is very cumbersome to use or debug only parts of the implementation.
The main reason is that almost all code resides in the main project source.
I propose making use of more sub-crates (similar to parity-bitcoin).
This also helps clean up unwanted dependencies; currently, for example, our consensus module depends on the network module.
It also speeds up compilation and testing when only some parts have changed.
Finally, it allows us to compile the relevant parts to WebAssembly.
As there are many ways to divide the current code into sub-crates, this issue is meant for discussion about how to structure those.
My proposal would be something like:
- db (database layer) ✅
- hash (hashes, non-key-related cryptography) ✅
- keys (private keys, signatures, addresses, ...) ✅
- primitives (Target, Coin, Compact, everything related to blocks and transactions, contracts and basic accounts, ...) ✅
- datastructures (Accounts, Blockchain, ChainStore and other concrete db stores) ✅
- consensus (Consensus, ConsensusAgent, Mempool) ✅
- mnemonic (mnemonics) ✅
- key-derivation (key derivation) ✅
- network (all networking) ✅
- network-primitives (primitives used in multiple locations) ✅
- core (putting things together, but no binary) (currently in main folder ✅)
- rpc (JSON-RPC) (to be merged)
- utils (most utils, configurable via feature flags) ✅

Note: I just quickly wrote this down as a starting point. Looking forward to your ideas. :)
Those that I already created are marked with ✅.
Those that are currently in progress are marked with ☑️.
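A split like this maps naturally onto a Cargo workspace. A minimal sketch (the member paths are assumptions for illustration, not the repository's actual layout):

```toml
# Top-level Cargo.toml: each sub-crate becomes a workspace member, so
# `cargo build -p <crate>` compiles only that crate and its dependencies.
[workspace]
members = [
    "db",
    "hash",
    "keys",
    "primitives",
    "consensus",
    "network",
    "utils",
]
```

With this layout, `cargo test -p <crate>` rebuilds only the changed crate and its dependents, which is where the compile-time savings come from.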
The example config file includes many "available settings" that are not actually available, such as dumb mode and the light and nano consensus types.
These should be removed from the example config file for release.
Nimiq is the web-based currency, so why not compile the Rust core to Wasm?
Useful resources:
I'll do some research on what's required to compile to the new target.
Currently, we only support loading certificates (and the private key) from a PKCS#12 file, but most certificate providers (including Let's Encrypt) issue their certificates as separate non-encrypted PEM-formatted files, so it makes sense for us to support loading certificates this way too.
I ran into this multiple times but never found the time to fix it. So here is the issue, open for anyone to fix. Or I will fix it once I get some spare time.
error[E0277]: `keys::AddressParseError` doesn't implement `std::fmt::Display`
--> src/config.rs:52:53
|
52 | Address::from_user_friendly_address(&s).map_err(Error::custom)
| ^^^^^^^^^^^^^ `keys::AddressParseError` cannot be formatted with the default formatter
|
= help: the trait `std::fmt::Display` is not implemented for `keys::AddressParseError`
= note: in format strings you may be able to use `{:?}` (or {:#?} for pretty-print) instead
= note: required by `config::_IMPL_DESERIALIZE_FOR_GenesisConfig::_serde::de::Error::custom`
Since keys::AddressParseError is an error type, we should implement the necessary traits (std::fmt::Display, and ideally std::error::Error) for it.
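A minimal sketch of what that could look like (the struct below is a stand-in, not the actual definition in the keys crate, which likely carries more detail about why parsing failed):

```rust
use std::fmt;

// Stand-in for keys::AddressParseError.
#[derive(Debug)]
pub struct AddressParseError;

impl fmt::Display for AddressParseError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "failed to parse user-friendly address")
    }
}

// With Display and Debug in place, the std Error trait comes for free.
impl std::error::Error for AddressParseError {}
```

Once Display is implemented, the `map_err(Error::custom)` call in the serde Deserialize impl compiles, since serde's `Error::custom` only requires `T: Display`.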
Like nimiq/core-js#440 in core-js, I think it's useful to have native module selection for packaged builds, and I'd like to take this on.
My idea is to statically link the different versions of libargon2 into the same binary, giving the function names a suffix for the target via a C macro.
We'd then have argon2d_hash_raw_avx2, argon2d_hash_raw_sse2 and so on built from the same source.
Like in nimiq/core-js#456, a $PACKAGING environment variable should be introduced that triggers a native build when it is false, instead of the avx2, sse2, … versions.
On startup, Rust would then set up function pointers to the native module and use rust-cupid to initialize them with the correct implementations.
I also filed #1 to reduce the number of functions so the module is easier to work with.
The alternative would be to dynamically load the implementations, like the .node libraries in Node.js, but that would be more difficult and inconvenient, in my opinion.
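The startup dispatch described above could look roughly like this. Feature detection and kernel bodies are stubbed: in the real design the flags would come from rust-cupid and the kernels would be the statically linked libargon2 symbols, so everything below is an assumption for illustration.

```rust
// Function-pointer dispatch: pick one Argon2 kernel at startup based
// on the CPU features detected at runtime.
type HashFn = fn(&[u8]) -> u64;

// Stub kernels; the real ones would be the extern "C" symbols
// argon2d_hash_raw_avx2, argon2d_hash_raw_sse2, and so on.
fn hash_native(data: &[u8]) -> u64 {
    data.iter().map(|&b| u64::from(b)).sum()
}
fn hash_sse2(data: &[u8]) -> u64 { hash_native(data) }
fn hash_avx2(data: &[u8]) -> u64 { hash_native(data) }

// Prefer the widest instruction set the CPU supports.
fn select_kernel(has_avx2: bool, has_sse2: bool) -> HashFn {
    if has_avx2 {
        hash_avx2
    } else if has_sse2 {
        hash_sse2
    } else {
        hash_native
    }
}

fn main() {
    let kernel = select_kernel(false, true);
    println!("{}", kernel(b"abc"));
}
```

The selection runs once at startup; after that, callers go through the function pointer with no per-call branching.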
Currently, only producers of micro blocks during the latest two epochs are known.
The slot data should be included in ChainInfo to persist the producer info of historic blocks.
error[E0554]: #![feature] may not be used on the stable release channel
--> collections/src/lib.rs:2:1
|
2 | #![feature(box_into_raw_non_null)]
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
error[E0554]: #![feature] may not be used on the stable release channel
--> collections/src/lib.rs:3:1
|
3 | #![feature(specialization)]
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^
error[E0554]: #![feature] may not be used on the stable release channel
--> collections/src/lib.rs:4:1
|
4 | #![feature(map_get_key_value)]
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
Compiling indexmap v1.0.2
error: aborting due to 3 previous errors
For more information about this error, try `rustc --explain E0554`.
error: Could not compile `nimiq-collections`.
warning: build failed, waiting for other jobs to finish...
error: build failed
Expected behavior: the project builds on Mac.
Actual behavior: the build fails.
Steps to reproduce:
1. Clone the repository.
2. Run `cargo build`.
error: suffixes on a tuple index are invalid
--> primitives/block/src/target.rs:12:64
|
12 | #[derive(Default, Clone, Copy, PartialEq, PartialOrd, Eq, Ord, Serialize, Deserialize)]
| ^^^^^^^^^ invalid suffix `i32`
error: proc-macro derive produced unparseable tokens
--> primitives/block/src/target.rs:12:64
|
12 | #[derive(Default, Clone, Copy, PartialEq, PartialOrd, Eq, Ord, Serialize, Deserialize)]
| ^^^^^^^^^
error: aborting due to 2 previous errors
error: Could not compile `nimiq-block`.
warning: build failed, waiting for other jobs to finish...
error: build failed
This issue is a meta issue with the information of all the features that should/could be backported from Albatross after the stable release:
Currently, we close our WebSocket connections by a select between a one-shot channel and the connection-processing future (core-rs/network/src/connection/network_connection.rs, lines 131 to 159 in 52906bd).
With the most recent versions of tungstenite and tokio-tungstenite there might be a nicer way to do it:
I only skimmed the two pull requests, but it seems it is now possible to send the close frame via the Sink and shut down the Stream part when receiving the other party's close frame.
README
Tried to build core-rs from source by cloning it from GitHub. I switched to the nightly toolchain with `rustup default nightly`. As soon as I started building with `cargo +nightly build`, compilation stopped with the following error:
error: linking with `cc` failed: exit code: 1
The README of core-rs states that, besides Rust nightly, you need gcc, pkg-config and libssl-dev, which I installed through apt.
Solution: install gcc-multilib as well, and it compiles with no issues.
In primitives/src/coin.rs there is a comment in the Deserialize implementation which says:
"Check that the value does not exceed JavaScript's Number.MAX_SAFE_INTEGER."
Shouldn't we check that on serialization, too? Serialization is currently auto-derived and thus just serializes the u64, even if it's greater than JavaScript's MAX_SAFE_INTEGER.
checked_add should also enforce this bound.
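A sketch of enforcing the bound in arithmetic (simplified; the real Coin type and its serde impls live in primitives/src/coin.rs and differ from this stand-in):

```rust
// 2^53 - 1, the largest integer JavaScript can represent exactly.
const MAX_SAFE_INTEGER: u64 = 9_007_199_254_740_991;

#[derive(Debug, Clone, Copy, PartialEq)]
struct Coin(u64);

impl Coin {
    // Fail both on u64 overflow and on exceeding the JavaScript-safe
    // range, so out-of-range values never come into existence.
    fn checked_add(self, other: Coin) -> Option<Coin> {
        self.0
            .checked_add(other.0)
            .filter(|&v| v <= MAX_SAFE_INTEGER)
            .map(Coin)
    }
}
```

A hand-written Serialize impl could apply the same filter and return a serialization error instead of emitting an out-of-range u64.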
The current validator network can handle one concurrent pBFT process, created after a proposal is received and its parent block is known. This leads to failures building a pBFT majority in some circumstances on high-latency networks, specifically in two cases:
To solve this, the validator network is rewritten to track multiple concurrent pBFT processes for different macro blocks, like so: Map<Blake2bHash, PbftState>.
The PbftState can exist in three settings:
At the end of each epoch, old valid and buffered proposals are pruned. Additionally, their macro block hashes are pushed to a blacklist LimitHashSet to prevent accidental re-creation of a PbftState by pBFT prepares/commits received after committing the macro block.
https://github.com/nimiq/core-rs/blob/master/client/src/logging.rs#L36
The default log level should be info, IMHO.
terorie/albatross devnet: when the validator produces a block, it gets rejected by the blockchain crate:
if !intended_slot_owner.verify(&micro_block.header, &justification) {
    warn!("Rejecting block - invalid justification for intended slot owner");
INFO nimiq_validator::validator > Produced block: block_number=73529, view_number=0, hash=73e7f9e...
INFO nimiq_consensus::consensus > Now at block #73529
TRACE nimiq_validator::validator > Next block producer: CompressedPublicKey(ae88f18...)
TRACE nimiq_validator::validator > Push result: Ok(Extended)
INFO nimiq_validator::validator > Produced block: block_number=73530, view_number=0, hash=186435...
WARN nimiq_blockchain_albatross::blockchain > Rejecting block - invalid justification for intended slot owner
DEBUG nimiq_blockchain_albatross::blockchain > Intended slot owner: CompressedPublicKey(ae88f18...)
ERROR nimiq_validator::validator > Failed to push produced micro block to blockchain: InvalidBlock(InvalidJustification)
This is the checklist of pending items/bugs/tasks that need to be addressed in order to release a 1.0 version:
- albatross: backporting anything that makes sense