Comments (3)
The stdout logs were coming from protocol.py. I have removed them and updated the config to set the default log level to INFO instead of DEBUG, which will report the number of nodes found so far every status_internal
seconds:
$ tail -f bitnodes.log
INFO 2013-05-10 05:29:11,435 1952 Starting bitnodes with 188 seed nodes
INFO 2013-05-10 05:30:14,494 1969 Found 351 nodes
INFO 2013-05-10 05:31:35,726 1969 Found 767 nodes
INFO 2013-05-10 05:32:47,370 1969 Found 1114 nodes
INFO 2013-05-10 05:33:47,530 1969 Found 1435 nodes
INFO 2013-05-10 05:34:48,635 1969 Found 1816 nodes
INFO 2013-05-10 05:35:48,948 1969 Found 2170 nodes
INFO 2013-05-10 05:36:49,278 1969 Found 2534 nodes
..
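For what it's worth, a periodic status line like the ones above can be driven by a re-arming timer. This is only a sketch of the idea, not bitnodes' actual code: the nodes set and the status_message/report_status names are invented here, and the interval would come from the config.

```python
import logging
import threading

# Default log level raised from DEBUG to INFO, with a format matching the excerpt.
logging.basicConfig(
    level=logging.INFO,
    format="%(levelname)s %(asctime)s %(process)d %(message)s",
)

nodes = set()  # hypothetical stand-in for the crawler's set of discovered nodes

def status_message():
    """Build the periodic status line."""
    return "Found %d nodes" % len(nodes)

def report_status(interval):
    """Log the node count now, then re-arm a timer to repeat every `interval` seconds."""
    logging.info(status_message())
    timer = threading.Timer(interval, report_status, args=(interval,))
    timer.daemon = True  # don't keep the process alive just for status reports
    timer.start()
```

Calling report_status(interval) once at startup then keeps emitting one "Found N nodes" line per interval while the crawl runs.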
from bitnodes.
Cool! If you added a bit of discussion of the configuration options to your README, I would have just changed my config myself.
If there are other configuration options (or you add new ones in the future), documenting them in the README would be totally cool.
Some ideas:
Looking at the logs, I thought of some other updates that would be good. One option would be to also report the number of unreachable nodes. This could be used to estimate not just how many nodes are online, but how many might come back online in the future.
It would also be good to schedule unreachable nodes for re-testing, so you don't miss nodes that just happen to be offline at the moment the script pings them.
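The re-test idea could be sketched with a simple retry queue ordered by due time. Everything here (the RetryQueue name, the delay, the node tuples) is hypothetical, not part of bitnodes:

```python
import heapq
import time

class RetryQueue:
    """Schedule unreachable nodes for re-testing after a delay (hypothetical sketch)."""

    def __init__(self, delay=3600):
        self.delay = delay  # seconds to wait before re-testing a node
        self._heap = []     # (due_time, node) pairs, ordered by due time

    def mark_unreachable(self, node, now=None):
        """Record a failed node so it gets re-pinged after `delay` seconds."""
        now = time.time() if now is None else now
        heapq.heappush(self._heap, (now + self.delay, node))

    def due_nodes(self, now=None):
        """Pop and return all nodes whose retry time has arrived."""
        now = time.time() if now is None else now
        due = []
        while self._heap and self._heap[0][0] <= now:
            due.append(heapq.heappop(self._heap)[1])
        return due
```

The crawler's main loop would call mark_unreachable() on ping failures and periodically drain due_nodes() back into its work queue.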
All in all, a fun and useful utility!
Thanks for the suggestions! The README certainly needs more information; I will add it once I complete the first run of the script.
My current run with "depth" reporting has found over 370K nodes after 20 hours. The network depth for some of the workers exceeds 200 nodes, and the script is still running at ~300 new nodes/min:
$ tail -f bitnodes.log | grep Found
INFO 2013-05-11 10:07:39,051 1503 Found 373505 nodes
INFO 2013-05-11 10:08:43,382 1503 Found 373819 nodes
If this takes over 24 hours (which is likely), some of the nodes will probably be stale by then.
I am hoping to complete a run every 24 hours, so one way I am going to run the script is to feed it more heavy nodes and increase the number of workers, with max_depth and max_age set. Setting max_depth will allow the script to search wider, and setting max_age will skip peering nodes with an old timestamp, e.g. older than 24 hours.
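A max_age check of the kind described could look roughly like this; the peer-list shape and the function name are assumptions, not bitnodes' actual code:

```python
import time

MAX_AGE = 24 * 60 * 60  # skip peers last seen more than 24 hours ago

def fresh_peers(peers, max_age=MAX_AGE, now=None):
    """Drop peer entries whose advertised timestamp is older than max_age.

    `peers` is assumed here to be an iterable of (timestamp, address) pairs,
    as carried in a Bitcoin addr message; the exact shape is a guess.
    """
    now = time.time() if now is None else now
    return [(ts, addr) for ts, addr in peers if now - ts <= max_age]
```

Filtering each addr response this way keeps the crawl from chasing addresses that have not been seen alive within the 24-hour window.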