
py_ecc's Introduction

py_ecc


Elliptic curve crypto in python including secp256k1, alt_bn128, and bls12_381.

Warning: This library contains some experimental code that has NOT been audited.

Read more in the documentation below. View the change log

Quickstart

python -m pip install py_ecc

BLS Signatures

py_ecc implements the IETF BLS draft standard v4 as per the inter-blockchain standardization agreement. The BLS standards specify different ciphersuites which each have different functionality to accommodate various use cases. The following ciphersuites are available from this library:

  • G2Basic also known as BLS_SIG_BLS12381G2_XMD:SHA-256_SSWU_RO_NUL_
  • G2MessageAugmentation also known as BLS_SIG_BLS12381G2_XMD:SHA-256_SSWU_RO_AUG_
  • G2ProofOfPossession also known as BLS_SIG_BLS12381G2_XMD:SHA-256_SSWU_RO_POP_

Basic Usage

from py_ecc.bls import G2ProofOfPossession as bls_pop

private_key = 5566
public_key = bls_pop.SkToPk(private_key)

message = b'\xab' * 32  # The message to be signed

# Signing
signature = bls_pop.Sign(private_key, message)

# Verifying
assert bls_pop.Verify(public_key, message, signature)

Aggregating Signatures

private_keys = [3, 14, 159]
public_keys = [bls_pop.SkToPk(key) for key in private_keys]
signatures = [bls_pop.Sign(key, message) for key in private_keys]

# Aggregating
agg_sig = bls_pop.Aggregate(signatures)

# Verifying signatures over the same message.
# Note this is only safe if Proofs of Possession have been verified for each of the public keys beforehand.
# See the BLS standards for why this is the case.
assert bls_pop.FastAggregateVerify(public_keys, message, agg_sig)

Multiple Aggregation

messages = [b'\xaa' * 42, b'\xbb' * 32, b'\xcc' * 64]
signatures = [bls_pop.Sign(key, message) for key, message in zip(private_keys, messages)]
agg_sig = bls_pop.Aggregate(signatures)

# Verify aggregate signature with different messages
assert bls_pop.AggregateVerify(public_keys, messages, agg_sig)

Developer Setup

If you would like to hack on py_ecc, please check out the Snake Charmers Tactical Manual for information on how we do:

  • Testing
  • Pull Requests
  • Documentation

We use pre-commit to maintain consistent code style. Once installed, it will run automatically with every commit. You can also run it manually with make lint. If you need to make a commit that skips the pre-commit checks, you can do so with git commit --no-verify.

Development Environment Setup

You can set up your dev environment with:

git clone [email protected]:ethereum/py_ecc.git
cd py_ecc
virtualenv -p python3 venv
. venv/bin/activate
python -m pip install -e ".[dev]"
pre-commit install

Release setup

To release a new version:

make release bump=$$VERSION_PART_TO_BUMP$$

How to bumpversion

The version format for this repo is {major}.{minor}.{patch} for stable, and {major}.{minor}.{patch}-{stage}.{devnum} for unstable (stage can be alpha or beta).

To issue the next version in line, specify which part to bump, like make release bump=minor or make release bump=devnum. This is typically done from the main branch, except when releasing a beta (in which case the beta is released from main, and the previous stable branch is released from said branch).

If you are in a beta version, make release bump=stage will switch to a stable.

To issue an unstable version when the current version is stable, specify the new version explicitly, like make release bump="--new-version 4.0.0-alpha.1 devnum"

py_ecc's People

Contributors

6ug, bhargavasomu, carlbeek, carver, cburgdorf, chihchengliang, davesque, fselmo, hwwhww, iamonuwa, kclowes, kirk-baird, njgheorghita, pacrob, pipermerriam, ralexstokes, reedsa, rustfix, tmckenzie51, vbuterin, wemeetagain, wolovim


py_ecc's Issues

Verify Signatures are in the correct Sub-group

  • py-ecc Version: 1.7.1

What is wrong?

Currently verification of a signature does not check that the signature is in the correct subgroup. This poses a security risk when verifying a pairing.

How can it be fixed

When verifying a signature, check that r * sig == 0 before pairing (see the sketch below).

I will update #79 to include the check, as this has been inserted into the standard.
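
A minimal sketch of the proposed check, assuming the signature has already been deserialized into a (projective) G2 point and that py_ecc.optimized_bls12_381 exposes curve_order, multiply and is_inf:

from py_ecc.optimized_bls12_381 import curve_order, is_inf, multiply

def signature_in_correct_subgroup(signature_point) -> bool:
    # The signature lies in the order-r subgroup iff r * sig is the point at infinity
    return is_inf(multiply(signature_point, curve_order))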

Wrapper for BLS python bindings in Chia Network

What is wrong?

To address #66, we can make use of faster implementations. Chia-Network has BLS signature implementation in C++, as well as its bindings for Python.

How can it be fixed

To make use of the bindings, we need a wrapper for the essential functions in bls.api

  • sign
  • privtopub
  • verify
  • aggregate_signatures
  • aggregate_pubkeys
  • aggregate_multiple

Possible concerns

Chia-Network/bls-signatures is licensed under Apache 2.0. From my understanding, since we only import the library, the Apache 2.0 terms should not affect our license. Is that correct? I don't have much experience with licenses, so please correct me if I'm wrong.

Unification of interfaces across secp256k1 and bls

What is wrong?

It would be nice if py_ecc.secp256k1, py_ecc.bls12_381 and py_ecc.bn128 all had roughly the same API.

How can it be fixed

A combination of exposing a few un-exposed APIs from secp256k1 and renaming things (a rough sketch follows the list below).

  • expose add and multiply from py_ecc.secp256k1.secp256k1 via py_ecc.secp256k1
  • I think py_ecc.secp256k1.secp256k1.inv needs to be renamed/aliased to neg to line up with the bls/bn APIs.
  • some other things like field_modulus and curve_order might need to have their equivalent values exposed from secp256k1
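
A rough sketch of what the proposed re-exports could look like in py_ecc/secp256k1/__init__.py; the names P and N are assumed to be the field modulus and curve order defined in py_ecc/secp256k1/secp256k1.py:

from py_ecc.secp256k1.secp256k1 import add, multiply, N, P

field_modulus = P
curve_order = N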

BLS optimization

What is wrong?

There's a need to optimize the BLS verification process. Here are some possible action items.

  • Optimize the math part: Vitalik has a shortcut for the pairing and final exponentiation
  • Profiling to identify the hotspot
  • Some function rewrite options (from the offline discussion of @hwwhww and @changwu-tw )
    • Cython, see #50
    • gmp
    • sage: only py2 support 😢
    • Numba

How can it be fixed

Pairing product function which matches Ethereum opcode (EIP-212)

def pairingProd(*inputs):
    """
    The Ethereum pairing opcode works like:

       e(p1[0],p2[0]) * ... * e(p1[n],p2[n]) == 1

    See: EIP 212

    >>> assert True == pairingProd((G1, G2), (G1, neg(G2)))
    """
    product = FQ12.one()
    for p1, p2 in inputs:
        product *= pairing(p2, p1)
    return product == FQ12.one()

Note that the G1 and G2 argument order is swapped compared to the Pairing.sol file, since py_ecc's pairing takes e(G2, G1).

It would be useful to have this included in the library, and to verify this is a correct implementation.

Test secp256k1 against coincurve

Given that coincurve is the most reputable implementation of the ECC operations, it would be good to implement some fuzz testing against coincurve.
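
A minimal sketch of such a fuzz test, assuming privtopub accepts a 32-byte secret and returns affine (x, y) integers, and using coincurve's uncompressed SEC1 encoding for comparison:

import os

from coincurve import PrivateKey
from py_ecc.secp256k1 import privtopub

def test_privtopub_matches_coincurve():
    for _ in range(100):
        secret = os.urandom(32)
        x, y = privtopub(secret)
        # Uncompressed SEC1 encoding is 0x04 || x || y
        expected = PrivateKey(secret).public_key.format(compressed=False)
        assert expected == b"\x04" + x.to_bytes(32, "big") + y.to_bytes(32, "big")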

Expose type hints via PEP561

blocked until #11 is complete

What was wrong?

PEP561 allows us to expose type hints from this library to other libraries which use it. Without this, any library which uses py_ecc and mypy style type checking doesn't benefit from the additional type safety.

Definition of done?

Do what was done in ethereum/eth-typing#10 to allow this data to be exposed when this library is installed as a package.

Move to SHA256

  • py-evm Version: 1.6.0
  • OS: osx/linux/win
  • Python Version (python --version):
  • Environment (output of pip freeze):

What is wrong?

ETH2.0 has been using SHA256 instead of Keccak256 since v0.6.0, following this PR: ethereum/consensus-specs#779
py_ecc is still using Keccak256: https://github.com/ethereum/py_ecc/blob/master/py_ecc/bls/hash.py
py_ecc is used in BLS test generators, etc.

How can it be fixed

By changing the hashing algorithm to SHA256 and releasing a new version, so it can be used as a dependency.
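
A hedged sketch of what the change in py_ecc/bls/hash.py could look like (the function name below is illustrative, not necessarily the one used in that file):

from hashlib import sha256

def hash_sha256(data: bytes) -> bytes:
    # SHA256 replaces Keccak256, matching the ETH2.0 spec change referenced above
    return sha256(data).digest()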

Benchmarking the Library

What is wrong?

I personally think it would be better if some benchmarks were added to this library, to make sure we are not undoing any optimizations while adding new functionality. Further, this is a fairly low-level module where I believe speed matters.

How can it be fixed

I think we could do something similar to what we are doing here.
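
As a minimal illustration of the kind of micro-benchmark that could be tracked over time, here is a timeit-based timing of a single pairing (assuming py_ecc.optimized_bn128 re-exports G1, G2 and pairing):

import timeit

from py_ecc.optimized_bn128 import G1, G2, pairing

# Average the cost of a pairing over a few runs
elapsed = timeit.timeit(lambda: pairing(G2, G1), number=5)
print("optimized_bn128 pairing: {:.3f}s per call".format(elapsed / 5))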

[DISCUSSION] Library Upgrade - Part 2

What is wrong?

The first part of the upgrade is over, where the fields used by different curves were generalized into the fields API. Regarding the 2nd part of the upgrade, this is what I had in mind.

  • Remove different modules for the different curves (bn128, bls12_381, optimized_bn128, optimized_bls12_381) and generalize it into the curves API.
  • In the curves API, all the common functionalities amongst all the above mentioned curves go into the class BaseCurve and class OptimizedBaseCurve. And the respective curves would inherit these base classes.
  • All the non-common functionalities go into each subcurve implementation.
  • We could then create an object for each of the curves and place it in __init__.py, so that users can directly import the respective curve object.
  • This could be considered a breaking API change for the following reason: there are 2 scenarios for how users import and use this library.

Scenario 1

from py_ecc import bn128
...
...
a = bn128.G1  # Some operation involving G1
b = bn128.G2  # Some operation involving G2
c = bn128.is_on_curve  # Some operation involving is_on_curve function
...
...

Scenario 2

from py_ecc.bn128 import (
    G1,
    G2,
    Z1,
    Z2,
   is_on_curve,
   ...
   ...
)
...
...
a = G1  # Some operation involving G1
b = G2  # Some operation involving G2
c = is_on_curve()  # Some operation involving is_on_curve function
...
...

Here, Scenario 1 won't be a breaking change, but Scenario 2 would be a breaking change for future releases.

How can it be fixed

/cc @pipermerriam @carver @ChihChengLiang @hwwhww

Generalize fields and curves

Currently, there are separate folders and files for bn128 and bls-12-381, with ~90% duplicated code.

It should be possible to generalize this more, creating a generic prime field class much like there is a semi-generic FQP class, as well as generic elliptic curves, and then only have separate files for the two existing curve types to define curve parameters and twist formulas and the slightly different pairing functions.
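
One possible shape for such a generic prime field, sketched only to illustrate the idea (class and attribute names are hypothetical, and only a couple of operations are shown):

class PrimeField:
    field_modulus = None  # set by each concrete curve's subclass

    def __init__(self, n):
        self.n = n % self.field_modulus

    def __add__(self, other):
        return type(self)(self.n + other.n)

    def __mul__(self, other):
        return type(self)(self.n * other.n)

class bn128_FQ(PrimeField):
    # alt_bn128 field prime
    field_modulus = 21888242871839275222246405745257275088696311157297823662689037894645226208583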

Modernize project

This project was made without the use of our standard project template.

https://github.com/carver/ethereum-python-project-template

The following should be cleaned up to closer match our modern project structure.

  • remove requirements-dev.txt in favor of using extras_require pattern.
  • add classifiers to setup.py
  • update the readme to contain the appropriate badges, title and general structure.
  • use the long_description_content_type as described here: pypa/setuptools#1236
  • update the tox.ini flake8 environment to use the more modern lint name as well as the same approach, installing from extras_require.
  • change licence to be the string MIT
  • add python_requires (and exclude python==3.5.2)
  • add Makefile and bumpfile based version management.

When installing py_ecc with pip, it raises the error: 'error: README.md: No such file or directory'

Platform: MacOS 10.12.5

Output:

~/work/pyethereum(develop) $ pip install py_ecc
Collecting py_ecc
/Users/terrytai/.pyenv/versions/2.7.6/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py:318: SNIMissingWarning: An HTTPS request has been made, but the SNI (Subject Name Indication) extension to TLS is not available on this platform. This may cause the server to present an incorrect TLS certificate, which can cause validation failures. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/security.html#snimissingwarning.
  SNIMissingWarning
/Users/terrytai/.pyenv/versions/2.7.6/lib/python2.7/site-packages/pip/_vendor/requests/packages/urllib3/util/ssl_.py:122: InsecurePlatformWarning: A true SSLContext object is not available. This prevents urllib3 from configuring SSL appropriately and may cause certain SSL connections to fail. You can upgrade to a newer version of Python to solve this. For more information, see https://urllib3.readthedocs.io/en/latest/security.html#insecureplatformwarning.
  InsecurePlatformWarning
  Using cached py_ecc-1.1.0.tar.gz
    Complete output from command python setup.py egg_info:
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/private/var/folders/gd/2jkp088j3d96slmx_5gllmc40000gn/T/pip-build-vmYHRm/py-ecc/setup.py", line 6, in <module>
        with open('README.md') as f:
    IOError: [Errno 2] No such file or directory: 'README.md'

    ----------------------------------------
Command "python setup.py egg_info" failed with error code 1 in /private/var/folders/gd/2jkp088j3d96slmx_5gllmc40000gn/T/pip-build-vmYHRm/py-ecc/

I found that setup.py tries to open README.md and LICENSE, but the two files are not in the package: https://pypi.python.org/pypi/py_ecc

For an arbitrary message, is an empty signature valid for an empty public key?

What is wrong?

In py_ecc, if you verify an arbitrary message against an empty public key and an empty signature, it verifies as True. That seems wrong.

In [10]: from py_ecc.bls import verify

In [11]: verify(b'\x56'*32, b'\xc0'+ b'\x00'*47, b'\xc0'+ b'\x00'*95, 0)
Out[11]: True

Arguments to return True

  • It is valid behavior for the pairing function
  • It is just a special case of a rogue key attack, and we don't let validators input public keys anyway.

Arguments to return False

  • It's a scary gotcha to let an arbitrary message verify as True if not handled carefully.
  • In other libraries, like Chia's BLS, this is not allowed.

How can it be fixed
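
One possible direction, sketched here under the assumption that pubkey_to_G1 and is_inf behave as in the current master branch: reject a point-at-infinity public key up front instead of relying on the pairing check.

from py_ecc.bls.g2_primitives import pubkey_to_G1
from py_ecc.optimized_bls12_381 import is_inf

def is_identity_pubkey(pubkey: bytes) -> bool:
    # A public key that decompresses to the point at infinity should be rejected
    return is_inf(pubkey_to_G1(pubkey))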

Add Python 3.8 env

What is wrong?

Add Python 3.8 env to the CI setting.

How can it be fixed

Fill this in if you know how to fix it.

secp256k1.py fails on Python 3.4

Error test:

import binascii
import rlp
# sha3 from module `pysha3` not `ssh3`
import sha3
from py_ecc.secp256k1 import ecdsa_raw_recover

n = 0
p = 20000000000
g = 100000
v = 1000
Tn = ''
Tp = p.to_bytes((p.bit_length()//8) + 1,byteorder='big')
Tg = g.to_bytes((g.bit_length()//8) + 1,byteorder='big')
Tt = binascii.unhexlify("687422eEA2cB73B5d3e242bA5456b782919AFc85")
Tv = v.to_bytes((v.bit_length()//8) + 1,byteorder='big')
Td = binascii.unhexlify("c0de")
transaction = [Tn, Tp, Tg, Tt, Tv, Td]
rlp_data=rlp.encode(transaction)
unsigned_message=sha3.keccak_256(rlp_data).hexdigest()
v = 28
r = int("5897c2c7c7412b0a555fb6f053ddb6047c59666bbebc6f5573134e074992d841",16)
s = int("1c71d1c62b74caff8695a186e2a24dd701070ba9946748318135e3ac0950b1d4",16)
ecdsa_raw_recover(unsigned_message, (v, r, s))

Error message:

Traceback (most recent call last):
File "", line 1, in
File "/home/apalau/python3.4/lib64/python3.4/site-packages/py_ecc/secp256k1/secp256k1.py", line 132, in ecdsa_raw_recover
z = bytes_to_int(msghash)
File "/home/apalau/python3.4/lib64/python3.4/site-packages/py_ecc/secp256k1/secp256k1.py", line 21, in bytes_to_int
o = (o << 8) + safe_ord(b)
TypeError: unsupported operand type(s) for +: 'int' and 'str'

On Python 2.7 the same ecdsa_raw_recover(unsigned_message, (v, r, s)) works well.

Python version:

python --version
Python 3.4.5

[EXPERIMENT] Fields as subclass of int

What is wrong?

Currently, the FQ and FQP fields have a lot of isinstance checks. This can be eradicated by making the classes subclasses of int.

Reference Comment - #41 (comment)

How can it be fixed

Fill this in if you know how to fix it.
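
A toy illustration of the idea (not the library's actual FQ), showing how subclassing int removes the need for isinstance checks in arithmetic; a small prime stands in for the real field modulus:

class IntFQ(int):
    field_modulus = 2**31 - 1  # placeholder prime, for illustration only

    def __new__(cls, n):
        return super().__new__(cls, n % cls.field_modulus)

    def __add__(self, other):
        return type(self)(int(self) + int(other))

    def __mul__(self, other):
        return type(self)(int(self) * int(other))

assert IntFQ(2) * 3 + 1 == 7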

py-evm tests will fail with current master branch of py_ecc

  • py-evm Version: master branch at d6adf9039647b6fc39b9d79725e9331008a5c438
  • OS: osx
  • Python Version (python --version): Python 3.7.0
  • Environment (output of pip freeze):

What is wrong?

Please include information like:

  • full output of the error you received
  • what command you ran: (in py-evm)$ pytest tests/json-fixtures/test_blockchain.py::test_blockchain_fixtures[.../py-evm/fixtures/BlockchainTests/GeneralStateTests/stCreate2/create2callPrecompiles_d5g0v0.json:create2callPrecompiles_d5g0v0_Constantinople:Constantinople] replace ... with your absolute path
  • the code that caused the failure: py_ecc.optimized_bn128.optimized_field_elements.FQ

tests/json-fixtures/test_blockchain.py F                                                                                             [100%]

================================================================= FAILURES =================================================================
 test_blockchain_fixtures[py-evm/fixtures/BlockchainTests/GeneralStateTests/stCreate2/create2callPrecompiles_d5g0v0.json:create2callPrecompiles_d5g0v0_Constantinople:Constantinople] 

fixture_data = ('py-evm/fixtures/BlockchainTests/GeneralStateTests/stCreate2/create2callPrecompiles_d5g0v0.json', 'create2callPrecompiles_d5g0v0_Constantinople', 'Constantinople')
fixture = {'blocks': [{'blockHeader': {'bloom': 0, 'coinbase': b'*\xdc%fP\x18\xaa\x1f\xe0\xe6\xbcfm\xac\x8f\xc2i\x7f\xf9\xba', '... 'lastblockhash': b'\xd3\x92\r\x904,\x0f\t\xaex$f/\x964\xd7\x8d\x8d\xfb\xf95\xdfK\x0f\x04\n\x81\xa5\x0c\xa2g\x95', ...}

    def test_blockchain_fixtures(fixture_data, fixture):
        try:
            chain = new_chain_from_fixture(fixture)
        except ValueError as e:
            raise AssertionError("could not load chain for {}".format((fixture_data,))) from e
    
        genesis_params = genesis_params_from_fixture(fixture)
        expected_genesis_header = BlockHeader(**genesis_params)
    
        # TODO: find out if this is supposed to pass?
        # if 'genesisRLP' in fixture:
        #     assert rlp.encode(genesis_header) == fixture['genesisRLP']
    
        genesis_block = chain.get_canonical_block_by_number(0)
        genesis_header = genesis_block.header
    
        assert_imported_genesis_header_unchanged(expected_genesis_header, genesis_header)
    
        # 1 - mine the genesis block
        # 2 - loop over blocks:
        #     - apply transactions
        #     - mine block
        # 3 - diff resulting state with expected state
        # 4 - check that all previous blocks were valid
    
        mined_blocks = list()
        for block_fixture in fixture['blocks']:
            should_be_good_block = 'blockHeader' in block_fixture
    
            if 'rlp_error' in block_fixture:
                assert not should_be_good_block
                continue
    
            if should_be_good_block:
                (block, mined_block, block_rlp) = apply_fixture_block_to_chain(
                    block_fixture,
                    chain,
>                   perform_validation=False  # we manually validate below
                )

block_fixture = {'blockHeader': {'bloom': 0, 'coinbase': b'*\xdc%fP\x18\xaa\x1f\xe0\xe6\xbcfm\xac\x8f\xc2i\x7f\xf9\xba', 'difficulty':...x00a\x01\x00`\x00`\x00`\x06b\t'\xc0\xf1`\x02U`\x00Q`\x00U\x00", 'gasLimit': 15000000, 'gasPrice': 1, 'nonce': 0, ...}]}
chain      = <abc.ChainFromFixture object at 0x10d834278>
expected_genesis_header = <BlockHeader #0 8a27153c>
fixture    = {'blocks': [{'blockHeader': {'bloom': 0, 'coinbase': b'*\xdc%fP\x18\xaa\x1f\xe0\xe6\xbcfm\xac\x8f\xc2i\x7f\xf9\xba', '... 'lastblockhash': b'\xd3\x92\r\x904,\x0f\t\xaex$f/\x964\xd7\x8d\x8d\xfb\xf95\xdfK\x0f\x04\n\x81\xa5\x0c\xa2g\x95', ...}
fixture_data = ('py-evm/fixtures/BlockchainTests/GeneralStateTests/stCreate2/create2callPrecompiles_d5g0v0.json', 'create2callPrecompiles_d5g0v0_Constantinople', 'Constantinople')
genesis_block = <ConstantinopleBlock(#Block #0)>
genesis_header = <BlockHeader #0 8a27153c>
genesis_params = {'block_number': 0, 'bloom': 0, 'coinbase': b'*\xdc%fP\x18\xaa\x1f\xe0\xe6\xbcfm\xac\x8f\xc2i\x7f\xf9\xba', 'difficulty': 131072, ...}
mined_blocks = []
should_be_good_block = True

tests/json-fixtures/test_blockchain.py:269: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
eth/tools/fixtures/helpers.py:232: in apply_fixture_block_to_chain
    mined_block, _, _ = chain.import_block(block, perform_validation=perform_validation)
eth/chains/base.py:726: in import_block
    imported_block = self.get_vm(base_header_for_import).import_block(block)
eth/vm/base.py:539: in import_block
    new_header, receipts, _ = self.apply_all_transactions(block.transactions, self.block.header)
eth/vm/base.py:492: in apply_all_transactions
    transaction,
eth/vm/base.py:408: in apply_transaction
    state_root, computation = self.state.apply_transaction(transaction)
eth/vm/state.py:256: in apply_transaction
    computation = self.execute_transaction(transaction)
eth/vm/forks/frontier/state.py:201: in execute_transaction
    return executor(transaction)
eth/vm/state.py:307: in __call__
    computation = self.build_computation(message, valid_transaction)
eth/vm/forks/frontier/state.py:133: in build_computation
    transaction_context,
eth/vm/forks/spurious_dragon/computation.py:33: in apply_create_message
    computation = self.apply_message()
eth/vm/forks/frontier/computation.py:76: in apply_message
    self.transaction_context,
eth/vm/computation.py:583: in apply_computation
    opcode_fn(computation=computation)
eth/vm/logic/system.py:215: in __call__
    return super().__call__(computation)
eth/vm/logic/system.py:193: in __call__
    self.apply_create_message(computation, child_msg)
eth/vm/logic/system.py:252: in apply_create_message
    child_computation = computation.apply_child_computation(child_msg)
eth/vm/computation.py:395: in apply_child_computation
    child_computation = self.generate_child_computation(child_msg)
eth/vm/computation.py:404: in generate_child_computation
    self.transaction_context,
eth/vm/forks/spurious_dragon/computation.py:33: in apply_create_message
    computation = self.apply_message()
eth/vm/forks/frontier/computation.py:76: in apply_message
    self.transaction_context,
eth/vm/computation.py:583: in apply_computation
    opcode_fn(computation=computation)
eth/vm/logic/call.py:136: in __call__
    child_computation = computation.apply_child_computation(child_msg)
eth/vm/computation.py:395: in apply_child_computation
    child_computation = self.generate_child_computation(child_msg)
eth/vm/computation.py:410: in generate_child_computation
    self.transaction_context,
eth/vm/forks/frontier/computation.py:76: in apply_message
    self.transaction_context,
eth/vm/computation.py:569: in apply_computation
    computation.precompiles[message.code_address](computation)
eth/precompiles/ecadd.py:35: in ecadd
    result = _ecadd(computation.msg.data_as_bytes)
eth/precompiles/ecadd.py:59: in _ecadd
    p1 = validate_point(x1, y1)
eth/_utils/bn128.py:26: in validate_point
    if not bn128.is_on_curve(p1, bn128.b):
../py_ecc/py_ecc/optimized_bn128/optimized_curve.py:69: in is_on_curve
    return y**2 * z - x**3 == b * z**3
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 3, other = 1

    def __mul__(self, other: IntOrFQ) -> "FQ":
        if isinstance(other, FQ):
            on = other.n
        elif isinstance(other, int):
            on = other
        else:
            raise TypeError(
                "Expected an int or FQ object, but got object of type {}"
>               .format(type(other))
            )
E           TypeError: Expected an int or FQ object, but got object of type <class 'py_ecc.optimized_bn128.optimized_field_elements.FQ'>

other      = 1
self       = 3

../py_ecc/py_ecc/fields/optimized_field_elements.py:67: TypeError
========================================================= 1 failed in 8.43 seconds =========================================================

How can it be fixed

  • might need a separate branch for the release

Library refactor

By @Bhargavasomu from: #24 (comment)

The following things come to my mind.

  • The classes FQ, FQP, FQ2 and FQ12 need not be reinitialized every time as they are not dependent on the type of curve or extension. So probably we could have these created in field_elements.py and we could use them everywhere (bn128, optimized_bn128, bls, optimized_bls).
  • We could also have a general class BaseCurve, and then maybe every curve (bn128_curve, optimized_bn128_curve, ...) could inherit this and make the changes specific to the inherited class.
  • We should probably move the constants into a separate file (constants.py)
  • We should also remove the assert statements which are not part of any function, but are part of the script in general, as shown
    assert linefunc(one, two, one) == FQ(0)
    assert linefunc(one, two, two) == FQ(0)
    assert linefunc(one, two, three) != FQ(0)
    assert linefunc(one, two, negthree) == FQ(0)
    assert linefunc(one, negone, one) == FQ(0)
    assert linefunc(one, negone, negone) == FQ(0)
    assert linefunc(one, negone, two) != FQ(0)
    assert linefunc(one, one, one) == FQ(0)
    assert linefunc(one, one, two) != FQ(0)
    assert linefunc(one, one, negtwo) == FQ(0)
  • Also the type hinting should be further generalized wherever possible (in terms of removing redundant types; e.g. Optimized_FQPoint2D could be replaced by FQPoint2D). Similarly, the type hinting should be carried out for the bls12_381 and optimized_bls12_381 submodules.

Also, the only differences I see between the curves are:

  • Difference in the constants such as b, b2, b12, G2, G12 ...
  • Difference in the twist function

@vbuterin are my facts right, or am I missing anything?
@pipermerriam is the above design ok?

Fix BLS Signature byte representation of G1 and G2 points

What is wrong?

In the BLS signature scheme, points in G1 and G2 consist of 381-bit integers. An integer should be converted to a 384-bit byte format with 3 extra flag bits as specified, while it is currently converted with a simple pt.to_bytes(48, "big"). A sketch of the flag layout is given after the list below.

How can it be fixed

  • Fix G1_to_pubkey, pubkey_to_G1, G2_to_signature, and signature_to_G2
  • Should fix in #52
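
A hedged sketch of the flag layout described above; q here is the BLS12-381 field modulus, and x, y are the affine coordinates of a non-infinity G1 point (the real fix belongs in the (de)serialization helpers listed above):

def compress_G1_sketch(x, y, q):
    c_flag = 1                # bit 383: the point is in compressed form
    b_flag = 0                # bit 382: set only for the point at infinity
    a_flag = (y * 2) // q     # bit 381: records which square root of y was taken
    z = x + a_flag * 2**381 + b_flag * 2**382 + c_flag * 2**383
    return z.to_bytes(48, "big")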

Add FQP.copy()

For a follow-up PR: I think this could be made a tiny bit more readable by adding a copy() method that takes the new coefficients, turning the line into:

return self.copy([x + y for x, y in  zip(self.coeffs, other.coeffs)])

From the current:

return type(self)([x + y for x, y in  zip(self.coeffs, other.coeffs)], self.curve_name)

Originally posted by @carver in #41
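
A minimal sketch of the suggested helper, as a method on FQP (assuming the element keeps a curve_name attribute as in the line quoted above):

    def copy(self, coeffs):
        # Build a new element of the same extension field with the given coefficients
        return type(self)(coeffs, self.curve_name)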

Cleanup isinstance checks for integers to disallow booleans

What is wrong?

Lots of places in the codebase do assertions like isinstance(v, int) to check that something is an integer. Since python considers True and False to be integer types, these checks won't disallow passing in a bool value.

How can it be fixed

Probably need to add a single utility and make use of it everywhere that we do these checks.

def assert_strictly_integer(v):
    assert isinstance(v, int) and not isinstance(v, bool)

Support for python 3.7

Add support for python 3.7

  • Add test run to tox.ini
  • Add test run to .travis.yml
  • Fix any issues that show up in tests.

Question: Show work for BLS12-381 Fq12 modulus

I'm learning more about bls12-381 and currently stumped by one part of the code:
I'm assuming that

FQ12_modulus_coeffs = (2, 0, 0, 0, 0, 0, -2, 0, 0, 0, 0, 0)  # Implied + [1]

means x^12 - 2x^6 + 2
and that it is the irreducible polynomial used as a modulus for Fq12.

I feel like this had to be from the specification https://github.com/zkcrypto/pairing/tree/master/src/bls12_381#bls12-381-instantiation , but I don't see how it was done.

I feel pretty dense with this one, but I'm hoping that someone more knowledgeable can show the work on how the above polynomial was created.

Edit: If this is off-topic for this repo, feel free to close this issue
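
For what it's worth, one possible derivation, assuming the tower construction from the linked spec (Fq2 = Fq[u]/(u^2 + 1), Fq6 = Fq2[v]/(v^3 - (u + 1)), Fq12 = Fq6[w]/(w^2 - v)): then w^6 = v^3 = u + 1, so (w^6 - 1)^2 = u^2 = -1, which rearranges to w^12 - 2w^6 + 2 = 0. That is exactly the polynomial x^12 - 2x^6 + 2 encoded by FQ12_modulus_coeffs.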

[EXPERIMENT] Caching classmethods one and zero

What is wrong?

Caching the outputs of the classmethods one and zero of the classes FQ and FQP may improve performance.

It would ideally be better to do this after #49 is done, so the performance gains can be measured.

How can it be fixed

Fill this in if you know how to fix it.
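
A hedged sketch of one way to do this with functools; whether it actually helps would need to be confirmed with benchmarks:

from functools import lru_cache

class FQ:
    ...  # existing implementation elided

    @classmethod
    @lru_cache(maxsize=None)
    def one(cls):
        return cls(1)

    @classmethod
    @lru_cache(maxsize=None)
    def zero(cls):
        return cls(0)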

Improving performance of the Library with Cython

What is wrong?

There are many places in the code, like finding the intersection of a tangent with the curve, where functions operate on fixed, specific datatypes (mostly int). I feel that Cythonizing these functions would improve performance.

Maybe tackle this after we have benchmarks for the library.

How can it be fixed

Convert the suitable functions into Cython functions.

Constant time hashing to the curve

What is wrong?

For bls12-381 we want to map messages to points on the curve. The current implementation uses an expensive "hash and check" method via hash_to_g2. It is desirable to have a more performant method of mapping a given message to the curve.

How can it be fixed

Luckily, there was recent work on a constant time hashing method w/ implementation here:

https://github.com/kwantam/bls12-381_hash

There are some BLS standardization efforts under way that will likely involve making this method the canonical "hash to the curve". If this method makes its way into the standard, then we will definitely want to include it.

The linked repo has a C implementation and a pure Python implementation. We can add both, in the appropriate places, to get the functionality we want.

Add type hinting

Largely copy/pasted from ethereum/py-evm#1398

Background

Type hints allow us to perform static type checking, among other things. They raise the security bar by catching bugs at development time that, without type support, may turn into runtime bugs.

This stackoverflow answer does a great job at describing their main benefits.

What is wrong?

This library currently does not have any type hints.

This needs to be fixed by:

  1. Adding all missing type hints.
  2. Enforcing (stricter) type checking in CI

How

There does exist tooling (monkeytype) to automate the generation of type hints for existing code bases. From my personal experience, monkeytype can be helpful but still requires manual fine-tuning. Also, manually adding these type hints serves as a great boost to the general understanding of the code base, as it forces one to think about the code.

  1. Run mypy --follow-imports=silent --warn-unused-ignores --ignore-missing-imports --no-strict-optional --check-untyped-defs --disallow-incomplete-defs --disallow-untyped-defs --disallow-any-generics -p eth

  2. Eliminate every reported error by adding the right type hint

Because this library supports older versions of python, the type hints will not be able to use the modern python3.6 syntax.

Definition of done

This issue is done when the following criteria are met:

  1. mypy is run in CI

Add a new command to the flake8 environment in the tox.ini file that runs:

mypy --follow-imports=silent --warn-unused-ignores --ignore-missing-imports --no-strict-optional --check-untyped-defs --disallow-incomplete-defs --disallow-untyped-defs --disallow-any-generics -p py_ecc

  2. Usage of type: ignore (silencing the type checker) is minimized and there's a reasonable explanation for its usage

Stretch goals

When this issue is done, stretch goals can be applied (and individually get funded) to tighten type support to qualify:

  1. mypy --strict --follow-imports=silent --ignore-missing-imports --no-strict-optional -p py_ecc

Look into ways to cross-verify implementation

What is wrong?

#24 (comment)

This is a little frightening (no blame intended). It would be great if we could come up with a way to cross check this implementation with a known good one or at least to have some additional static test vectors we could test against.

How can it be fixed

Look into what other implementations exist and how we could use them in a CI environment or manually extract some test vectors from them.

Error during pip installation

Environment

Python 3.6.1

pip list
Package    Version
---------- -------
pip        18.1
setuptools 28.8.0

What is wrong?

Please include information like:

command:

pip install py_ecc==1.4.5

log:

Collecting py_ecc==1.4.5
Exception:
Traceback (most recent call last):
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2883, in _dep_map
    return self.__dep_map
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2677, in __getattr__
    raise AttributeError(attr)
AttributeError: _DistInfoDistribution__dep_map

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_vendor/packaging/requirements.py", line 93, in __init__
    req = REQUIREMENT.parseString(requirement_string)
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_vendor/pyparsing.py", line 1654, in parseString
    raise exc
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_vendor/pyparsing.py", line 1644, in parseString
    loc, tokens = self._parse( instring, 0 )
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_vendor/pyparsing.py", line 1402, in _parseNoCache
    loc,tokens = self.parseImpl( instring, preloc, doActions )
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_vendor/pyparsing.py", line 3417, in parseImpl
    loc, exprtokens = e._parse( instring, loc, doActions )
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_vendor/pyparsing.py", line 1406, in _parseNoCache
    loc,tokens = self.parseImpl( instring, preloc, doActions )
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_vendor/pyparsing.py", line 3205, in parseImpl
    raise ParseException(instring, loc, self.errmsg, self)
pip._vendor.pyparsing.ParseException: Expected stringEnd (at char 7), (line:1, col:8)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2963, in __init__
    super(Requirement, self).__init__(requirement_string)
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_vendor/packaging/requirements.py", line 96, in __init__
    requirement_string[e.loc:e.loc + 8], e.msg
pip._vendor.packaging.requirements.InvalidRequirement: Parse error at "'(===3.2.'": Expected stringEnd

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_internal/cli/base_command.py", line 143, in main
    status = self.run(options, args)
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_internal/commands/install.py", line 318, in run
    resolver.resolve(requirement_set)
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_internal/resolve.py", line 102, in resolve
    self._resolve_one(requirement_set, req)
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_internal/resolve.py", line 306, in _resolve_one
    set(req_to_install.extras) - set(dist.extras)
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2840, in extras
    return [dep for dep in self._dep_map if dep]
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2885, in _dep_map
    self.__dep_map = self._compute_dependencies()
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2895, in _compute_dependencies
    reqs.extend(parse_requirements(req))
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2956, in parse_requirements
    yield Requirement(line)
  File "/Users/hwwang/.pyenv/versions/3.6.1/envs/pyeth36b/lib/python3.6/site-packages/pip/_vendor/pkg_resources/__init__.py", line 2965, in __init__
    raise RequirementParseError(str(e))
pip._vendor.pkg_resources.RequirementParseError: Parse error at "'(===3.2.'": Expected stringEnd

How can it be fixed

Fill this in if you know how to fix it.

BLS point (de)compression issue

What is wrong?

Follow up with #107, to see what's wrong with the (de)compression.

History

The implementation and the given test case

  • The flags of "normal" point at infinity PK: 110
  • The flags of the #107 case PK: 010

The decompress_G1 function:

def decompress_G1(z: G1Compressed) -> G1Uncompressed:
    """
    Recovers x and y coordinates from the compressed point.
    """
    # b_flag == 1 indicates the infinity point
    b_flag = (z % POW_2_383) // POW_2_382
    if b_flag == 1:
        return Z1
    x = z % POW_2_381
    # Try solving y coordinate from the equation Y^2 = X^3 + b
    # using quadratic residue
    y = pow((x**3 + b.n) % q, (q + 1) // 4, q)
    if pow(y, 2, q) != (x**3 + b.n) % q:
        raise ValueError(
            "The given point is not on G1: y**2 = x**3 + b"
        )
    # Choose the y whose leftmost bit is equal to the a_flag
    a_flag = (z % POW_2_382) // POW_2_381
    if (y * 2) // q != a_flag:
        y = q - y
    return (FQ(x), FQ(y), FQ(1))

I think decompress_G1 should have checked c_flag (that c_flag == 1). The same goes for the other (de)compression functions.
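
A hedged sketch of the missing check, in terms of the same names used in the snippet above, to be placed near the top of decompress_G1 (z is the 384-bit compressed input):

    c_flag = z // POW_2_383  # the top bit; must be 1 for a compressed point
    if c_flag != 1:
        raise ValueError("c_flag should be 1 for compressed points")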

/cc @ChihChengLiang @CarlBeek could you 👀 sanity check it?

Use custom object for infinity point instead of `None`

What is wrong?

For #89, discussion offline:

If I call mypy --strict --follow-imports=silent --ignore-missing-imports py_ecc (without --no-strict-optional), then 50+ errors about "None" infinity point would show up

If I call the same command with your PR, the errors:

py_ecc/bn128/bn128_curve.py:63: error: 'None' object is not iterable
py_ecc/bn128/bn128_curve.py:75: error: 'None' object is not iterable
py_ecc/bls12_381/bls12_381_curve.py:66: error: 'None' object is not iterable
py_ecc/bls12_381/bls12_381_curve.py:78: error: 'None' object is not iterable

But it's already handled with if is_inf(pt): return pt

None is really tricky in typing, perhaps there should be a special constant object for infinity point?

How can it be fixed

Produces G1 and G2 points of different length.

So I tried to verify the signature and public key output from
Line 273 : threshold_sig ::

But the G1 and G2 points I received were of length 115, not 77 (see first link)

So I was unable to verify the signature

signature = "85b95df4656df8ef8d9a7d463073cd3829f0e17fc44c6f1fc65587536fcb5307e35a3dcc4916f42d747426c3ca47bfe90ce1fa2fddb19b7f31985ef6ecc1bb897d3d2b25d7aa6b26b5dbe8b82cdccfa3bae56ae467c898e587fbb8ae40a40c6c"
public_key = "b4788b55a14a67f9a44657f86583118b335693e5bed02038e9b886608657b60cfd53125343a543fe0aa8d9e6c826c2dfab749a6747438d945160f81cc3253ee8f10fabb860234d195c008ad7762e9c01456fc694112374684ceaf0668b423fb4a29e7b6e922d204ab83dc517ab5da26f7eef17a3052b0936921659f36af77c0346a2df3dac9e0c0732b5921162e5ecc68a14562777d3b0941d4a09cb5a975c078698571e8e34280c28310c74fca1944b2ddcc552358d891b1b38ba722e45503d"

Used Functions

https://github.com/ethereum/py_ecc/blob/master/py_ecc/bls/g2_primitives.py
 pubkey_to_G1
 signature_to_G2

Fails test vectors from hash_to_curve v7

See hash_to_curve spec version 7, specifically msg=abc.

Expected

It should produce (stripping hex to 4 chars):

u[0]    = 0a7d + I * 1630
u[1]    = 0a1c + I * 07aa
P.x     = 1953 + I * 0357
P.y     = 0882 + I * 0184

Actual

u0 and u1 are correct, everything else is not.

hash_to_curve u0=(0a7d...55f2 + 1630...b378×i), u1=(0a1c...f96b + 07aa...b788×i)
Q0 Point<x=0265...f2c4 + 1957...bd53×i, y=0824...2918 + 0d1a...9589×i>
Q1 Point<x=0368...80fe + 156d...19b1×i, y=0cda...087e + 0cc7...c12e×i>
R Point<x=07e2...67a9 + 04cf...29ec×i, y=13db...0fd6 + 12b9...364a×i>
Point<x=0a4a...ab29 + 073f...766e×i, y=0e92...6bd4 + 1695...13eb×i>

Test code

(copy-paste helper methods to hash_to_curve.py if you want to log u, Q, R)

from hashlib import sha256
import py_ecc
from py_ecc.bls.hash_to_curve import hash_to_G2

def fq_to_hex(fq):
    h = hex(fq)[2:].rjust(96, '0')
    return '{}...{}'.format(h[:4], h[-4:])
def fq2_to_hex(fq2):
    a, b = fq2.coeffs
    return '{} + {}×i'.format(*map(fq_to_hex, fq2.coeffs))
def point_3d_to_2d(P):
    z3inv = (P[2] ** 3).inv()
    return (P[0] * P[2] * z3inv, P[1] * z3inv)
def ppp(p): # pretty print point
    x, y = point_3d_to_2d(p)
    return 'Point<x={}, y={}>'.format(fq2_to_hex(x), fq2_to_hex(y))

msg = b'abc'
dst = b'BLS12381G2_XMD:SHA-256_SSWU_RO_TESTGEN'
print(ppp(hash_to_G2(msg, dst, sha256)))

cc @hwwhww @CarlBeek

Stack overflow while computing pairing() on curve bn128

  • OS: win10
  • Python Version (python --version):3.8.5
  • Environment (output of pip freeze):py-ecc==5.2.0

When computing pairing(G2, G1), the process gives me "Process finished with exit code -1073741571 (0xC00000FD)", even though I've set threading.stack_size(200000000), which doesn't work.
Here's my test code:

import threading

import py_ecc.bn128.bn128_curve as curve
import py_ecc.bn128.bn128_pairing as pairing

def test():
    Z = pairing.pairing(curve.G2, curve.G1)
    print(Z)
    return

if __name__ == '__main__':
    threading.stack_size(200000000)
    thread = threading.Thread(target=test())
    thread.start()

A reasonable guess is that when computing pow() (which is in py_ecc/fields/field_properties.py), the huge integers make the stack too deep, which finally overflows. But I still don't know how I can make it work.

Test and bugfix equality testing with different degree polynomials.

What is wrong?

If I read the current implementation of FQP.__eq__ correctly, it returns True for this test:

FQP([1, 2, 3], ...) == FQP([1, 2], ...)

which seems wrong.

In the current implementation, zip terminates on the shorter list, and there's no explicit length/degree check.

How can it be fixed

Add tests for related scenarios, and make sure implementation is correct.

Also, because of the type checking involved, make sure that the equality test is commutative when the class is different, like:

fq2_a = FQP([2, 3], (1, 0))
fq2_b = FQ2([2, 3], (1, 0))

assert fq2_a == fq2_b
assert fq2_b == fq2_a
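
A hedged sketch of an __eq__ on FQP that compares the degree explicitly before comparing coefficients (not the library's current implementation):

    def __eq__(self, other):
        if not isinstance(other, FQP):
            return NotImplemented
        # Different degrees can never be equal; zip alone would silently truncate
        return (
            len(self.coeffs) == len(other.coeffs)
            and all(a == b for a, b in zip(self.coeffs, other.coeffs))
        )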

Switch curve_name from string to NamedTuple

py_ecc/fields/__init__.py has a lot of duplications and is at high risk of typos (like importing the coeffs or field modulus from the wrong curve).

Let's think about alternatives to this setup. Maybe something like:

class Curve(NamedTuple):
  field_modulus: int
  fq12_modulus_coeffs: Tuple[int, ...]
  fq2_modulus_coeffs: Tuple[int, int]

bls12_381 = Curve(
  field_modulus = 21888242871839275222246405745257275088696311157297823662689037894645226208583,
  fq2_modulus_coeffs = (1, 0),
  fq12_modulus_coeffs = (2, 0, 0, 0, 0, 0, -2, 0, 0, 0, 0, 0),  # Implied + [1]
)

^ This still doesn't feel quite right, but it's a starting point.

The end goal is to be able to create bn128_FQ2 with a reference to a single thing (like this new NamedTuple)

Of course, this approach comes at the downside of having to import the curve to create an FQ. Maybe a little more annoying at the REPL. But for the caller in a file, it means fewer magic strings and less likelihood for a typo with a confusing error. For the library, hopefully better readability.

Originally posted by @carver in a pull request review comment.

FQP initialization should take the mod of input coefficients

Full context: ethereum/consensus-specs#508

What is wrong?

For BN and BLS, the FQP object should take the modulo of the inputs during initialization.

Test case:

from py_ecc.optimized_bls12_381 import FQ, FQ2

x = FQ(-1)
y = FQ2([-1,-1])

print(x) # 4002409555221667393417789825735904156556882819939007885332058136124031650490837864442687629129015664037894272559786
print(y) # (-1, -1)

This is used in modular_squareroot part of Eth2.0 spec (https://github.com/ethereum/eth2.0-specs/blob/928f9772fa51ed2e251b99c4795097da976c4bde/specs/bls_signature.md#modular_squareroot) and Trinity (https://github.com/ethereum/trinity/blob/a1b0f058e7bc8e385c8dac3164c49098967fd5bb/eth2/_utils/bls.py#L63-L77)

How can it be fixed

Take the mod of the inputs in FQP __init__

class FQP(object):
    def __init__(self, coeffs, modulus_coeffs):
        assert len(coeffs) == len(modulus_coeffs)
        self.coeffs = tuple(coeffs)
        # The coefficients of the modulus, without the leading [1]
        self.modulus_coeffs = tuple(modulus_coeffs)
        # The degree of the extension field
        self.degree = len(self.modulus_coeffs)

Like it's done for FQ

class FQ(object):
    def __init__(self, n):
        if isinstance(n, self.__class__):
            self.n = n.n
        else:
            self.n = n % field_modulus
        assert isinstance(self.n, int)

Optimisations for Hash to Curve

What is wrong?

Currently we use a 'constant' time hash to curve function.

It is possible to significantly increase speeds by removing the constant time requirements.

How can it be fixed

The two key areas where speeds could be made are in py_ecc/optimized_bls12_381/optimized_swu.py:

  • Returning when a solution is found in sqrt_division_FQ2()
  • Stopping the loop in optimized_swu_G2() when either success or success_2 is true.

The Documentation of py_ecc

Hi~, I'm planning to use this library as a simple ECC library for other purposes, but I didn't find any documentation of the API. The link mentioned in the readme doesn't refer to the API or to concrete functions. Could you please give me some links that introduce the usage of this library (including explanations of some functions or demo usage)?
Thanks

ReadTheDocs and doctests

What is wrong?

The project is currently only documented with small examples in the readme

How can it be fixed

Create sphinx based docs and host on RTD.

Current pinned mypy version is incompatible with python 3.8+

What is wrong?

Can't install dev dependencies of py_ecc with python3.8+ because the mypy version we're using requires a version of typed-ast that is incompatible with python 3.8. See issue: python/typed_ast#126

How can it be fixed

Workaround for local development for now is to use python 3.7.

I started down the path of updating to mypy v0.910, but there were too many typing errors that needed more time than I had today. The branch I started is here.
