noncesense-research-lab / blockchain_big_bang
Studying the upper bound on allowed block size increases under the current dynamic algorithm.
License: MIT License
As we consider ways to adjust the dynamic block size algorithm to limit absurd growth, it will be necessary to model all of the options on a variety of transaction volume scenarios (low traffic, high traffic, bursts, attacks, see #1).
Please use this issue as a forum to brainstorm and share your ideas to be tested in the models.
The current protocol has only short-term memory, lasting about 3 hours: the block size is limited to twice the median block size of the last 100 blocks. Thus the algorithm is not aware whether the block size has increased by 10x in the last day, or 1000x in the last week.
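As a minimal sketch (ignoring the penalty mechanics and minimum block size), the current rule can be written as:

```python
import statistics

def max_block_size(recent_sizes, window=100, multiplier=2):
    """Current dynamic limit: twice the median size of the last `window` blocks."""
    window_sizes = recent_sizes[-window:]
    return multiplier * statistics.median(window_sizes)

# With a steady 300 kB history, the next block may be up to 600 kB.
history = [300_000] * 100
print(max_block_size(history))
```

Because the window is only 100 blocks (~3.3 hours at a 2-minute target), sustained large blocks drag the median up quickly, which is exactly the exponential-growth concern being studied here.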
Use the lower of two medians, e.g.
(Isthmus suggested & modeled in the big bang notebook.) I have not tuned these parameters whatsoever; 4x in 1 week is an arbitrary example number.
E.g. calculate the maximum block size based on the last 5000 blocks instead of the last 100 blocks. (This might just give us the worst of both worlds.)
Select lowest value of median block size at multiple timescales. This is one possible generalization of the LSTM described above. Take the lowest of:
(Isthmus suggested & modeled in the big bang notebook.) I have not tuned these parameters whatsoever; 100x in 1 month is an arbitrary example number.
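The multi-timescale idea above might be sketched as follows. The window lengths and multipliers are illustrative placeholders (based on the arbitrary 2x/~3 h, 4x/week, 100x/month examples in this thread), not tuned values:

```python
import statistics

def max_block_size_multiscale(recent_sizes):
    # Illustrative, untuned (window in blocks, growth multiplier) pairs:
    # ~3 hours, ~1 week, ~1 month at a 2-minute block target.
    scales = [(100, 2), (5040, 4), (21600, 100)]
    limits = []
    for window, multiplier in scales:
        window_sizes = recent_sizes[-window:]
        limits.append(multiplier * statistics.median(window_sizes))
    # The most restrictive timescale wins, so a short burst cannot
    # outrun the longer-term medians.
    return min(limits)
```

The effect is that growth is bounded at every timescale simultaneously: the short window still allows quick reaction to bursts, while the long windows cap cumulative growth.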
Turns the exponential growth into linear growth
(Idea from smooth and suraeNoether)
Not quite sure what this means or who suggested it first. Feel free to explain in the comments.
I haven't fully thought this through - I found it in surae's summary, but I'm not sure who suggested it first. Feel free to explain in the comments. (I'm curious how this works when traffic is low.)
Include the network hashrate (estimated from difficulty and timing) in the calculation of maximum block size. The notion is that this attaches a PoW barrier to blockchain bloat, and uses hash rate as a rough proxy for actual adoption (certainly more meaningful than transaction volume). Since we can't forecast hash rate, we would model several different scenarios (increasing, decreasing, volatile, etc)
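One purely hypothetical way to model such a coupling for the simulations (the functional form, the baseline, and the 4.0 growth cap below are all made-up assumptions to be tested, not a proposal):

```python
import statistics

def max_block_size_pow(recent_sizes, current_hashrate, baseline_hashrate):
    """Hypothetical PoW-coupled limit: the usual 2x-median cap, scaled by
    the ratio of estimated hash rate to an assumed reference baseline."""
    median_limit = 2 * statistics.median(recent_sizes[-100:])
    # Assumed bound on how much hash-rate growth can inflate the limit.
    hashrate_factor = min(current_hashrate / baseline_hashrate, 4.0)
    # Assume a falling hash rate does not shrink the limit below the median rule.
    return median_limit * max(hashrate_factor, 1.0)
```

Since hash rate cannot be forecast, a simulation would sweep `current_hashrate` through increasing, decreasing, and volatile trajectories and observe how the cap responds.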
At the end of the 18th hour:

```
# Current algorithm
Blocksize = 614400.0 kB
>>> Blockchain size = 117.5009 GB
```
This is where the simulation goes wrong, because the current Monero node's source code has a sanity check for all incoming blocks; see core::handle_incoming_block in src/cryptonote_core/cryptonote_core.cpp:
```cpp
if(block_blob.size() > get_max_block_size())
{
  LOG_PRINT_L1("WRONG BLOCK BLOB, too big size " << block_blob.size() << ", rejected");
  bvc.m_verifivation_failed = true;
  return false;
}
```
get_max_block_size() returns CRYPTONOTE_MAX_BLOCK_SIZE, which is set to 500000000 bytes (488281.25 kB), so the blockchain growth rate can never exceed ~335.3 GiB/day with the current code.
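The daily figure follows directly from the 2-minute block target (720 blocks per day):

```python
CRYPTONOTE_MAX_BLOCK_SIZE = 500_000_000  # bytes, hard sanity-check cap
BLOCKS_PER_DAY = 24 * 60 // 2            # 720 blocks at a 2-minute target

# Worst-case growth: every block at the hard cap.
max_daily_growth_gib = CRYPTONOTE_MAX_BLOCK_SIZE * BLOCKS_PER_DAY / 2**30
print(f"{max_daily_growth_gib:.3f} GiB/day")  # ~335.276 GiB/day
```

So even in the worst case, the hard cap bounds growth to roughly 335 GiB/day, far below the simulated 614400 kB blocks.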
We will be modeling different methods (see #2) for deciding how to add an upper bound to dynamic block size expansion. Preliminary work is available in a Jupyter Notebook.
We must consider how each algorithm will perform under a variety of circumstances: low traffic, high traffic, bursts, and attacks (see #1). What else?
We want to be sure to model all of the normal use patterns AND every edge case we can imagine. Add your ideas in the comments, please. If possible, please include a qualitative description and pseudocode (not mandatory).
What size blocks are necessary to start crashing nodes? From Monero Research Lab meeting on 26-Nov, discussing an edge case that creates blocks so large that they knock out nodes:
@Gingeropolous: ... useful information needed to address the issue: current node processing ability. What is the existing blocksize tipping point for processing
Can we get some experimental data on this? Perhaps on a few different computers. This will be crucial in guiding selection of parameters for the dynamic blocksize algorithm.
(Perhaps instead of Monero testnet, we should make a separate hostile testnet for these types of studies)
My initial simulations assumed that the net fees for the attack transactions must be greater than the coinbase reward (to override the block size penalty financial incentive). This is a 0th order approximation (i.e. not great).
The reward might have to be slightly larger to offset fees.
I thought I saw somebody say that enticing miners to mine a maximum-size (total penalty) block was 4x the coinbase. (I don't quite understand why yet).
Thoughts?
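As a 0th-order illustration of that incentive, here is the CryptoNote-style quadratic penalty (a block of size B > M forfeits a (B/M - 1)^2 fraction of the base reward); the reward and median values below are made-up examples:

```python
def penalty(base_reward, block_size, median_size):
    """Quadratic block-size penalty: zero at the median,
    equal to the full base reward at twice the median."""
    if block_size <= median_size:
        return 0.0
    excess = block_size / median_size - 1.0
    return base_reward * excess ** 2

# Break-even fees for a miner expanding the block to size B:
# profit = base_reward - penalty + fees >= base_reward  =>  fees >= penalty.
base_reward = 3.0   # illustrative coinbase, not a real value
median = 300_000    # illustrative median block size in bytes
for b in (median, int(1.5 * median), 2 * median):
    print(b, penalty(base_reward, b, median))
```

Under this formula, a maximum-size (2x median) block forfeits exactly 1x the coinbase, which matches the 0th-order break-even assumption above; whatever reasoning produces the 4x figure is not captured by this sketch.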