
dappnet-protocol's Introduction

axon

Axon is a proof-of-concept protocol which allows anyone to summon BitTorrent swarms to host files.

By running an Axon node, you will earn tokens for hosting BitTorrent content, based on how much you upload.

Publishers can denominate hosting rewards in their own token. For example, a publisher can pay out hosting rewards in $SNX.

wen token?

[telegram]

How it works.

In BitTorrent, nodes called peers share chunks of a torrent, called pieces, with other peers. They keep track of the uploads/downloads for each peer they interact with, via an economic protocol called tit-for-tat. Tit-for-tat ensures that torrenting is positive-sum: peers who upload more get better download speeds in return.
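The tit-for-tat policy above boils down to a periodic choking decision: reciprocate with the peers who have uploaded the most to you. A minimal sketch, assuming a simplified model (real clients like libtorrent use rolling rate estimates rather than lifetime byte counts, and the "optimistic unchoke" slot here is a simplified version of the real mechanism):

```typescript
interface Peer {
  id: string;
  uploadedToUs: number;     // bytes this peer has sent us
  downloadedFromUs: number; // bytes we have sent them
}

// Tit-for-tat: unchoke the k peers who gave us the most data.
// One extra random "optimistic unchoke" slot lets newcomers with no
// history bootstrap a ratio.
function selectUnchoked(peers: Peer[], k: number): Peer[] {
  const sorted = [...peers].sort((a, b) => b.uploadedToUs - a.uploadedToUs);
  const regular = sorted.slice(0, k);
  const rest = sorted.slice(k);
  if (rest.length > 0) {
    const optimistic = rest[Math.floor(Math.random() * rest.length)];
    return [...regular, optimistic];
  }
  return regular;
}
```
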

The BT ecosystem is supported by trackers. Trackers are centralized servers which track which peers are part of a torrent swarm, and also serve as a centralized authority on seed ratios (upload/download ratios). Peer discovery happens through trackers (centralized) and the Mainline DHT (decentralized), meaning the system remains robust against censorship. Trackers are a tradeoff of more trust for better performance.

We extend BitTorrent in the following ways:

  • Axon protocol allows anyone to create permissionless on-chain pools for incentivising torrent swarms to seed content.
  • An Axon pool consists of a set of tokens (the hosting rewards) and a custom distribution function; the tokens are distributed to peers according to the reward matrix.
  • The reward matrix is computed by an off-chain entity called the aggregator.
    • The aggregator receives upload/download logs from BitTorrent peers and trackers. It combines these logs using ML to determine a reward distribution for a hosting epoch.
    • It then generates a ZK-STARK which proves this computation, and then posts it on-chain.
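To make the pool's settlement step concrete, here is one way the distribution could look: the aggregator's reward matrix yields a verified upload weight per peer, and the pool pays out its staked tokens pro-rata. This is a sketch under assumptions — the pro-rata rule stands in for the pool's custom distribution function, and the plain map stands in for the ZK-proven output:

```typescript
// Pro-rata payout of one epoch's staked rewards, keyed by verified
// upload weight. In Axon the weights would come from the aggregator's
// ZK-proven computation; here they're a plain map (illustrative).
function distributeEpochRewards(
  poolBalance: number,                // tokens staked for this epoch
  uploadWeights: Map<string, number>, // peer pubkey -> verified upload score
): Map<string, number> {
  const total = [...uploadWeights.values()].reduce((a, b) => a + b, 0);
  const payouts = new Map<string, number>();
  if (total === 0) return payouts; // nothing uploaded, nothing paid out
  for (const [peer, weight] of uploadWeights) {
    payouts.set(peer, (poolBalance * weight) / total);
  }
  return payouts;
}
```
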

why not arweave/filecoin?

Axon complements them. Arweave/Filecoin are good at data storage, though aren't as performant as BitTorrent for data sharing. BitTorrent swarms, while making no guarantees about data storage, are much more scalable when it comes to sharing large data sets worldwide - since anyone can join as a server.

The end vision.

Buy hpos10i.eth, upload .html, pay $USDC for hosting - all in one dapp.

Technical proof-of-concept.

  • Modify webtorrent bittorrent-tracker and node:
    • log upload/downloads to disk (signed).
    • register public/private keypair for receiving rewards.
  • Aggregator
    • serve connections over libp2p, receive logs from nodes.
    • weight upload/download logs by uncertainty using ML/EBSL/EigenTrust (in particular, a ZK implementation by the EF's Privacy Scaling Explorations).
    • compute reward matrix from upload/download logs.
    • generate ZK proof of claim
    • submit on-chain
  • Protocol
    • create hosting pool for a single torrent hash
    • stake token rewards
    • claim rewards
  • Node
    • Automatically seed torrents for pools a user has joined.
  • Axon Desktop client
    • Runs a torrent client + axon node automatically.
    • Allows user to simply "join/leave" swarms for different content.
    • Allows user to hit one button to "claim" rewards.
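The signed upload/download logs from the proof-of-concept steps above might look something like the record shape below. This is a sketch under assumptions: the field names are illustrative, and a real node would sign the digest with its registered private key (e.g. ed25519) rather than just hashing:

```typescript
import { createHash } from "node:crypto";

// A single upload/download accounting record a peer writes to disk.
interface TransferLog {
  infoHash: string;     // torrent this transfer belongs to
  fromPeer: string;     // uploader's registered reward pubkey
  toPeer: string;       // downloader's registered reward pubkey
  bytesUploaded: number;
  epoch: number;        // hosting epoch the aggregator settles
}

// Deterministic digest of a record. In practice the peer would sign this
// digest with its registered keypair, and the aggregator would verify
// the signature before weighting the log.
function logDigest(log: TransferLog): string {
  const canonical = JSON.stringify([
    log.infoHash, log.fromPeer, log.toPeer, log.bytesUploaded, log.epoch,
  ]);
  return createHash("sha256").update(canonical).digest("hex");
}
```
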

dappnet-protocol's People

Contributors: liamzebedee

dappnet-protocol's Issues

Tokenomics

  • Data collectives
  • Nodes host data in group
  • Gate access w/ token
  • Users stake token for access, pay for download
  • Nodes earn token for sharing data
  • Nodes lose stake for unavailability (regular availability sampling)
  • Each node reports "hosted subset" i.e. files it hosts

Background

BitTorrent private trackers

  • A private tracker is a data sharing community, consisting of a private index of torrents and a private network of nodes which collectively host them according to "each to their own interest".
  • They're very useful - you get fast downloads, and there is a high-quality, curated index of torrents.
  • Nodes contribute hosting resources, which in turn give them reputation. Your seed ratio (upload:download ratio) dictates your membership of the community - ie. if you upload a lot, you can download anything from anyone. But if you download more than you upload, you get banned.
  • The interesting thing - nodes only host content they are interested in. There is no onus to host the entire dataset - this is probably impossible, since usually there is like >10TB of data. But due to overlapping interests, collectively people are able to host the entire dataset together.
  • Bootstrapping your reputation: you need to bootstrap your reputation. Usually users receive an invite from another user in order to just "join" the community and see torrents. And then to build up your reputation, you seed "freeleech" torrents - these are basically torrents that are known to be in demand by the wide community of the tracker, and you can easily earn rep for seeding. The only other alternative is buying "seed ratio" for like $10.
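The seed-ratio rule described above is simple arithmetic. A sketch of how a tracker might enforce it — note the 0.7 minimum ratio and the 5 GB grace allowance are made-up policy numbers for illustration, not any real tracker's rules:

```typescript
// Private-tracker style ratio policy: ratio = uploaded / downloaded.
// New members get a grace allowance so they can bootstrap a ratio
// (e.g. by seeding freeleech torrents) before the minimum applies.
function ratioOk(
  uploaded: number,
  downloaded: number,
  minRatio = 0.7,   // hypothetical policy threshold
  graceBytes = 5e9, // hypothetical: first 5 GB downloaded are "free"
): boolean {
  if (downloaded <= graceBytes) return true;
  return uploaded / downloaded >= minRatio;
}
```
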

Proposal

A tokenised private tracker

What does a tokenised form of the above look like? Keep in mind, Arweave/Filecoin don't do the job, otherwise we might be using them instead of BitTorrent.

Example context:

  • data sharing community - let's imagine it's for large video game archives.
  • token - $VIDYA
  • user stories:
    • I want to download some old video games lost to time.
    • I want to earn tokens for hosting this data AND NOTHING ELSE (ie. strictly NOT like an Arweave/Filecoin node. You could run a node with as little as 5GB SSD space).
    • I want to buy data hosting using tokens on Uniswap.

A design proposal:

  • $VIDYA subnet / data sharing collective
  • Nodes host data in group
  • Gate access w/ token
  • Users stake token for access
    • Pay for download
      • Fixed price (at network rates)
      • Or maybe "freemium" allotment? e.g. unlimited internet w/ slowdown after 10GB
  • Nodes earn token for sharing data
  • Nodes lose stake for unavailability
    • Each node reports "hosted subset" ie. the stuff it's hosting
    • Regularly test that nodes are available to share this data (ie. liveness & data availability in one).
    • Token rewards are proportional to the % of the total dataset you host.
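The availability-sampling rules above can be sketched as one epoch of a challenge game: sample random pieces from a node's self-reported hosted subset, slash its stake if it fails a challenge, otherwise reward it in proportion to the fraction of the dataset it hosts. All parameters and names here are illustrative assumptions:

```typescript
interface NodeReport {
  stake: number;
  hostedPieces: Set<number>; // self-reported "hosted subset"
}

// One epoch of the availability game. `proves` answers whether the node
// can actually serve a sampled piece (in practice, a challenge-response
// over the wire; liveness and data availability tested in one).
function settleEpoch(
  node: NodeReport,
  totalPieces: number,
  epochReward: number,
  sampledPieces: number[],
  proves: (piece: number) => boolean,
  slashFraction = 0.1, // hypothetical penalty
): { reward: number; stake: number } {
  const failed = sampledPieces.some(
    (p) => node.hostedPieces.has(p) && !proves(p),
  );
  if (failed) {
    return { reward: 0, stake: node.stake * (1 - slashFraction) };
  }
  // Token rewards proportional to the % of the total dataset hosted.
  const share = node.hostedPieces.size / totalPieces;
  return { reward: epochReward * share, stake: node.stake };
}
```
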

Supply-demand dynamics

Two open design routes:

  1. Fixed-price token gate, dynamic price download charge. You buy 50 $VIDYA to access dataset, then every 1GB is like 10 $VIDYA depending on the current network's saturation.
  2. Dynamic price token gate, fixed price download charge. You buy x $VIDYA to access the dataset, priced against a fixed capacity of customers, and every download is then charged at a flat rate.

There is a counterintuitive fact here:

  • 1-to-1 data sharing is zero-sum. I send you data, you receive it, both incur cost.
  • 1-to-n-to-m data sharing is probably positive-sum. Assuming bandwidth costs are not uniform, sharing data to more nodes facilitates a bandwidth arbitrage: nodes that can upload more cheaply and quickly in their jurisdiction add more capacity to the network at lower cost.

To keep the model simple: it may be hard to price "bandwidth" at the start, so we can probably just "inflate" by minting upload rewards.

supply : upload cap
demand : download cap
earn tokens for upload

elastic surge pricing for download/upload
measure the surge for each torrent / datum -
1. saturation = total download demand next 15min / total upload capacity next 15min
2. incent higher upload rewards for "hot torrents", increase supply+availability
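The saturation formula in step 1 maps directly to code; the multiplier in step 2 needs a curve, and the linear-with-cap curve below is a made-up example of how a saturation ratio might scale upload rewards for "hot torrents":

```typescript
// saturation = forecast download demand / forecast upload capacity
// over the next pricing window (e.g. the next 15 minutes).
function saturation(demandBytes: number, capacityBytes: number): number {
  if (capacityBytes === 0) return Infinity;
  return demandBytes / capacityBytes;
}

// Hypothetical surge curve: base rewards while capacity is sufficient,
// scaling linearly once demand outstrips capacity, capped so one hot
// torrent can't drain the whole pool.
function uploadRewardMultiplier(sat: number, cap = 5): number {
  if (sat <= 1) return 1;     // enough seeders, base rewards
  return Math.min(sat, cap);  // hot torrent: boost supply + availability
}
```
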

economic mechanisms:
- elastic supply-demand for upload-download.
- mechanism for incenting "geographically separate" replication / decentralization. ie. "reward boost +15%" if you are located >100km from current seed, or 1/3 of the uploaders for this file
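The geographic-decentralization mechanism above could be sketched with a great-circle distance check. The +15% boost and >100km threshold come from the note; the function names and everything else are illustrative assumptions (a real design would also need a way to attest locations):

```typescript
// Great-circle distance in km between two (lat, lon) points (haversine).
function haversineKm(
  [lat1, lon1]: [number, number],
  [lat2, lon2]: [number, number],
): number {
  const R = 6371; // mean Earth radius, km
  const rad = (d: number) => (d * Math.PI) / 180;
  const dLat = rad(lat2 - lat1);
  const dLon = rad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(rad(lat1)) * Math.cos(rad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// "Reward boost +15% if you are located >100km from current seeds."
function geoBoost(
  me: [number, number],
  seeds: [number, number][],
): number {
  const farFromAll = seeds.every((s) => haversineKm(me, s) > 100);
  return farFromAll ? 1.15 : 1.0;
}
```
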

like bitcoin:
- constant inflation rate
- token reward for running the network (upload)
like ethereum:
- stake
- slash for not performing role (hosting, unavailable)
like celestia:
- it's cool to run a node
like blah:
- $vidya is for short-term - paying to get at cool data
- $vidya is for long-term - betting on a data network becoming really cool and in demand. how?
  - anyone can join as a data uploader and earn $vidya
  - $vidya fees (from joiners) are redistributed to stakers on a longer-term curve
  - incent longer-term node set by adjusting these distribution curves (like synthetix, 1yr lockup or sth, power law not linear)
  - constant inflation rate for the vidya hosting set

product vibes:
- earn tokens for hosting the llama model. permanent poap if you want.
- I want users to hold these tokens because they're fucking COOL. and speculate on them. like $bitcoin (the real one), and $tia
- run your own node without needing to buy the token. e.g. synthetix runs its own hosting nodes for data.

I want two mechanisms:
1. nodes retain data they're interested in
2. nodes retain data they believe has future interest to others (for profit, speculation)


