
vc-di-eddsa's Introduction

EdDSA Data Integrity Cryptosuites

This specification describes a Data Integrity cryptographic suite for use when creating or verifying a digital signature using the twisted Edwards Curve Digital Signature Algorithm (EdDSA) and Curve25519 (ed25519). The approach is accepted by the U.S. National Institute of Standards and Technology (NIST) in the latest FIPS 186-5 draft and, after ratification, is expected to meet U.S. Federal Information Processing requirements when using cryptography to secure digital information.

We encourage contributions meeting the Contribution Guidelines. While we prefer the creation of issues and Pull Requests in the GitHub repository, discussions also occur on the public-vc-wg mailing list.

Verifiable Credentials Working Group

Other Relevant Working Group Repositories

Discussion Forums

Process Overview for VC Data Model Pull Requests

  1. Anyone can open a PR on the repository. Note that for the PR to be merged, the individual must agree to the W3C Patent Policy.
  2. Once a PR is opened, it will be reviewed by the Editors and other WG Members.
  3. The W3C CCG is automatically notified of PRs as they are raised and discussed.
  4. PRs are usually merged in 7 days if there is adequate review and consensus, as determined by the Chairs and Editors.

Roadmap for 2022-2024

The VCWG has a set of deliverables and a timeline listed in the most recent VCWG charter.

vc-di-eddsa's People

Contributors

clehner, davidlehn, dlongley, dmitrizagidulin, filip26, iherman, jsassassin, kimdhamilton, matthieubosquet, msporny, or13, peacekeeper, seabass-labrax, tallted, tplooker, wind4greg, wyc

vc-di-eddsa's Issues

Proof configuration and previousProof (maybe editorial)

Just checking.

§3.2.5 Proof Configuration does not mention the previousProof property, if applicable. That is, when calculating the value of canonicalProofConfig, previousProof is not taken into consideration.

I do not know whether this is intentional or an omission.

If it is intentional, it might be worth emphasizing. The formulation in §3.2.2 Verify Proof, point (2) only mentions proofValue as the property to be removed, which gives the false impression that previousProof is fine. (I know that in §3.2.5 the properties to be used are listed explicitly, but then why bother with point (2) in §3.2.2 in the first place?)

(The ecdsa case is identical.)
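The ambiguity can be illustrated with a short, non-normative sketch. The proof properties and previousProof value below are hypothetical; the two dictionaries show what the §3.2.2 step (2) reading versus the explicit §3.2.5 property list would each produce:

```python
# Hypothetical proof options; the previousProof value is illustrative only.
proof = {
    "type": "DataIntegrityProof",
    "cryptosuite": "eddsa-rdfc-2022",
    "created": "2023-01-01T00:00:00Z",
    "proofPurpose": "assertionMethod",
    "previousProof": "urn:uuid:an-earlier-proof",
    "proofValue": "z5SpZ...",
}

# Reading 1 (section 3.2.2, point 2): remove only proofValue.
options_removed = {k: v for k, v in proof.items() if k != "proofValue"}

# Reading 2 (section 3.2.5): keep only the explicitly listed properties.
listed = {"type", "cryptosuite", "created", "proofPurpose"}
options_listed = {k: v for k, v in proof.items() if k in listed}

# previousProof survives under the first reading but not the second.
```

Under the first reading previousProof is part of the signed proof configuration; under the second it is silently dropped, which is exactly the discrepancy raised above.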

Review multibase vs base58 key representation

@dlongley I recall you may have objected to the idea of publicKeyMultibase vs publicKeyBase58.

Would love to hear your thoughts.

My initial thoughts are that publicKeyMultibase might be easier to align with IPLD in the future.

eddsa-rdfc-2022 Transformation passes wrong type to "Deserialize JSON-LD"

https://www.w3.org/TR/vc-di-eddsa/#transformation-eddsa-rdfc-2022 takes a map that is not the result of the Node Map Generation algorithm and passes it to https://www.w3.org/TR/json-ld11-api/#deserialize-json-ld-to-rdf-algorithm, which expects "a map node map, which is the result of the Node Map Generation algorithm".

w3c/json-ld-api#579 reports that this type requirement is unclear, but the discussion there indicates that you're expected to do the extra step in callers.

Examples for Test Vector Generation

Hi all, it looks like we will have the following "Proof Representations" that need test vectors:

  1. DataIntegrityProof (type: "DataIntegrityProof", cryptosuite: "eddsa-2022")
  2. Ed25519Signature2020 (type: "Ed25519Signature2020"), this is "legacy" but well implemented
  3. json-eddsa-2022 (type: "DataIntegrityProof", cryptosuite: "json-eddsa-2022")

If we can agree on what the unsigned example document would look like, I'll generate a set of test vectors. (I'll put the code to do this in a public directory and will try to keep the code as simple as possible; see, for example, this gist.)

I was thinking of using the following unsigned example document:

{
    "@context": [
        "https://www.w3.org/ns/credentials/v2",
        {"UniversityDegreeCredential": "https://example.org/examples#UniversityDegreeCredential"}
    ],
    "id": "http://example.edu/credentials/3732",
    "type": [
        "VerifiableCredential",
        "UniversityDegreeCredential"
    ],
    "issuer": "https://example.edu/issuers/565049",
    "issuanceDate": "2010-01-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:ebfeb1f712ebc6f1c276e12ec21",
        "degree": {
            "type": "BachelorDegree",
            "name": "Bachelor of Science and Arts"
        }
    }
}

Let me know any suggestions/changes to this example unsigned document.

Some examples have the wrong / old context version

jcs-eddsa-2022 review

The description of jcs-eddsa-2022 cryptosuite (3.2 jcs-eddsa-2022) refers to section 3.1.5 Proof Configuration (eddsa-2022) which contains:

  1. Set proofConfig.context to unsecuredDocument.context

Is this step necessary for jcs-eddsa-2022?

  1. Let canonicalProofConfig be the result of applying the Universal RDF Dataset Canonicalization Algorithm [RDF-CANON] to the proofConfig.

Shouldn't this step be replaced with JCS canonicalization?

(These steps were introduced in #31)

Also, there's an incorrect cryptosuite name in section 3.2 jcs-eddsa-2022, step 4. I made a branch with a fix, silverpill#1 (I couldn't make a pull request against this repo).
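For comparison, here is a minimal sketch of what JCS canonicalization of a proof configuration could look like. For documents containing only ASCII strings (as here), Python's json.dumps with sorted keys and compact separators matches RFC 8785 output; full JCS additionally specifies number-serialization and string-escaping rules not exercised by this example:

```python
import json

# Hypothetical proof configuration for the JCS-based suite.
proof_config = {
    "type": "DataIntegrityProof",
    "created": "2023-01-01T00:00:00Z",
    "cryptosuite": "jcs-eddsa-2022",
    "proofPurpose": "assertionMethod",
}

# Sorted keys + compact separators approximate RFC 8785 (JCS) for
# ASCII-string-only documents like this one.
canonical = json.dumps(proof_config, sort_keys=True, separators=(",", ":"))
```

This is the kind of step that would replace the RDF Dataset Canonicalization step for the JCS variant.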

Security Considerations: Provable Security and Ed25519 Libraries

It turns out that after EdDSA Ed25519 signature adoption, rigorous proofs of its security properties were made, but some of these desirable security properties only apply if some relatively straightforward checks are enforced during verification.

However, not all signature libraries for EdDSA Ed25519 perform these checks. Here is some text, drawn mostly from the papers, explaining the security properties that can be proven and why they are desirable. In addition, the papers' authors provide test vectors on GitHub to help test signature libraries for some edge cases (only 12 tests in total). Might we want to capture those in this specification as an appendix? Below is some potential starting text. Opinions on how much guidance we should supply on this issue? For a more complete overview, see the authors' presentation this year to NIST.

Ed25519 Provable Security and Library Implementation Reality

Digital signatures may exhibit a number of desirable cryptographic properties; among these are (using the definitions from reference 2 below):

EUF-CMA (existential unforgeability under chosen message attacks) is usually the minimal security property required of a signature scheme. It guarantees that any efficient adversary who has the public key $pk$ of the signer and received an arbitrary number of signatures on messages of its choice (in an adaptive manner), $\{m_i, \sigma_i\}_{i=1}^N$, cannot output a valid signature $\sigma^*$ for a new message $m^* \notin \{m_i\}_{i=1}^N$ (except with negligible probability). In case the attacker outputs a valid signature on a new message, $(m^*, \sigma^*)$, it is called an existential forgery.

SUF-CMA (strong unforgeability under chosen message attacks) is a stronger notion than EUF-CMA. It guarantees that any efficient adversary who has the public key $pk$ of the signer and received an arbitrary number of signatures on messages of its choice, $\{m_i, \sigma_i\}_{i=1}^N$, cannot output a new valid signature pair $(m^*, \sigma^*)$ such that $(m^*, \sigma^*) \notin \{m_i, \sigma_i\}_{i=1}^N$ (except with negligible probability). Strong unforgeability implies that an adversary not only cannot sign new messages, but also cannot find a new signature on an old message.

Binding signature (BS). We say that a signature scheme is binding if no efficient signer can output a tuple $[pk, m, m^\prime, \sigma]$, where both $(m, \sigma)$ and $(m^\prime, \sigma)$ are valid message-signature pairs under the public key $pk$ and $m \ne m^\prime$ (except with negligible probability). A binding signature makes it impossible for the signer to claim later that it has signed a different message; the signature binds the signer to the message.

Strongly Binding signature (SBS). Certain applications may require a signature to be binding not only to the message but also to the public key. We say that a signature scheme is strongly binding if no efficient signer can output a tuple $[pk, m, pk^\prime, m^\prime, \sigma]$, where $(m, \sigma)$ is a valid signature for the public key $pk$ and $(m^\prime, \sigma)$ is a valid signature for the public key $pk^\prime$, and either $m^\prime \ne m$ or $pk \ne pk^\prime$, or both (except with negligible probability).

It was recently proven in references 1 and 2 that EdDSA with curve Ed25519 can possess all these desirable cryptographic properties, but only if sufficient input validation is performed when verifying an Ed25519-based signature. In particular, to achieve the SUF property, all validation checks specified in RFC8032 must be performed. In addition, to achieve the SBS property, public keys which are elliptic curve points of "small order" are invalid and must be rejected.

The validation checks of RFC8032 and the "small order point" checks have been incorporated into some Ed25519 implementations recently; however, as shown in reference 2, many libraries have not implemented these validation checks. Since these checks are either part of an established signature standard (RFC8032) or relatively minimal (there are only eight canonical small-order points to check for), it is strongly recommended that implementers of this specification choose a signature library satisfying these conditions. Test vectors to test a signature library can be found at GitHub: ed25519-speccheck.

References

  1. The Provable Security of Ed25519: Theory and Practice, 2020, Jacqueline Brendel, Cas Cremers, Dennis Jackson, and Mang Zhao.
  2. K. Chalkias, F. Garillot, and V. Nikolaenko, Taming the Many EdDSAs, 2020; accepted to the SSR Conference, October 2020.
  3. RFC8032: Edwards-Curve Digital Signature Algorithm (EdDSA)

Algorithm Completion, Fixes, Enhancements, and Example Contexts

While working through test vectors for the draft and receiving comments on those test vectors, we came across some ambiguities, missing pieces, and mistakes (only one) in the Algorithms section of the draft. In addition, based on concerns about the @context field in the test vectors, I did a review of the examples (non test vectors) in the draft and found some issues/questions/recommendations. I've summarized my findings below and, depending on outcomes, will issue pull requests as deemed necessary.

Restructure, enhance and fix the Algorithms section.

New outline that is more in line with the actual processing steps:

  • 3.1 Unsecured Document Transformation (rather than just "Transformation" to be clear)
  • 3.2 Proof Configuration Generation (rather than just "Proof Configuration" to be clear)
  • 3.3 Proof Configuration Transformation (new section on how to canonicalize proof configuration)
  • 3.4 Hashing (need to fix the order of the concatenated hashes; this is an error! The current document does not reflect deployed implementations)
  • 3.5 Proof Generation (enhance with an explanation of how the outputs of the previous steps feed into the signature algorithm)
  • 3.6 Proof Verification (this is incomplete; it cites Data Integrity, which does not furnish enough information. In particular, it does not cover recreating the proof configuration, which is needed in verification. Enhance with an explanation of how to use the previous steps 3.1-3.4 and the EdDSA algorithm in verification)

In the new section on "Proof Configuration Transformation" it would be good to explicitly address how @context should be handled. I would suggest:

  • In the (type: "DataIntegrityProof", cryptosuite: "eddsa-2022") and (type: "Ed25519Signature2020") cases, i.e., those using "RDF Dataset Canonicalization" that the @context field from the unsecured document is added as a whole to the proofConfig being created prior to canonicalization. Note: this may result in a bit of extra @context information being added to proofConfig that doesn't change the canonical value, but keeps this step simple.
  • In the (type: "DataIntegrityProof", cryptosuite: "json-eddsa-2022") case, i.e., using JCS RFC8785, that the @context field not be added to proofConfig, since it is not needed for JCS processing.
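The two suggested behaviors can be sketched as follows. This is a non-normative illustration; the base proof configuration and the elided credential properties are hypothetical:

```python
unsecured_document = {
    "@context": [
        "https://www.w3.org/ns/credentials/v2",
        "https://www.w3.org/ns/credentials/examples/v2",
    ],
    # ...remaining credential properties elided...
}

base_proof_config = {
    "type": "DataIntegrityProof",
    "cryptosuite": "eddsa-2022",
    "proofPurpose": "assertionMethod",
}

# RDF Dataset Canonicalization cases: copy the whole @context over,
# even if some of it is unused (it does not change the canonical value).
proof_config_rdfc = dict(base_proof_config)
proof_config_rdfc["@context"] = unsecured_document["@context"]

# JCS (json-eddsa-2022) case: @context is not needed for JCS processing.
proof_config_jcs = dict(base_proof_config)
```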

Regularize Context Usage

@context, in examples:

Hence, can we simplify the above to just use the "https://www.w3.org/ns/credentials/v2" reference in the @context? If not, can someone more expert in JSON-LD contexts write some non-normative text to aid developers trying to implement this specification?

@context in test vectors

The example for the unsecured document to be used for the generation of all test vectors is shown below. If it should be modified prior to regeneration of the test vectors let me know.

{
    "@context": [
        "https://www.w3.org/ns/credentials/v2",
        "https://www.w3.org/ns/credentials/examples/v2"
    ],
    "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33",
    "type": ["VerifiableCredential", "AlumniCredential"],
    "name": "Alumni Credential",
    "description": "A minimum viable example of an Alumni Credential.",
    "issuer": "https://vc.example/issuers/5678",
    "validFrom": "2023-01-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:abcdefgh",
        "alumniOf": "The School of Examples"
    }
}

How is `proofConfig` meant to be serialized?

Step 2 of Section 3.2 Hashing defines proofConfigHash to be the hash of proofConfig, but it is not stated how proofConfig is intended to be serialized/canonicalized in such a way that proofConfigHash can be unambiguously interpreted/reproduced.

It would be much appreciated if this could be clarified.

Thank you!

Is the hashing formulation inconsistent?

§3.2.4 says, in bullet entry (3):

Let hashData be the result of joining proofConfigHash (the first hash) with transformedDocumentHash (the second hash).

However, proofConfigHash is defined in bullet item (2), and transformedDocumentHash is defined in bullet item (1). I.e., the item should be either:

Let hashData be the result of joining transformedDocumentHash (the first hash) with proofConfigHash (the second hash).

or

Let hashData be the result of joining proofConfigHash (the second hash) with transformedDocumentHash (the first hash).

Either way is fine, although the first possibility would make all current implementations invalid 😒.

Of course, a third solution would be to exchange bullet items (1) and (2).

(I did not check whether the same error occurs elsewhere in the same document as well as in other cryptosuites.)
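The construction at issue can be sketched with placeholder inputs; the only contested point is the order of the final concatenation. This assumes SHA-256 as the hash, and the canonical strings here are stand-ins for real RDF canonicalization output:

```python
import hashlib

# Placeholder canonical forms; real values come from RDF canonicalization.
canonical_proof_config = "...canonical N-Quads of the proof options..."
canonical_document = "...canonical N-Quads of the unsecured document..."

proof_config_hash = hashlib.sha256(canonical_proof_config.encode()).digest()
transformed_document_hash = hashlib.sha256(canonical_document.encode()).digest()

# The ordering reported as matching deployed implementations:
# proofConfigHash first, then transformedDocumentHash.
hash_data = proof_config_hash + transformed_document_hash
```

Whichever ordering the specification settles on, hashData is a fixed 64-byte value that is then fed to the signature algorithm.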

Unify Error Handling

Similar to issue w3c/vc-di-ecdsa#63. To unify error handling language across this specification and other cryptosuite specifications I'd recommend:

  1. Use appropriate error handling language as in DI specification, e.g., "an error MUST be raised and SHOULD convey an error type of ERROR_CODE_NAME." where ERROR_CODE_NAME is defined in the DI specification.
  2. Use standardized error codes from the DI specification, and if needed, add new codes to the DI specification.
  3. Check for non-rigorous error-handling language and determine whether it needs to be updated (errors without codes that need codes).

Below I show codes used but not in the DI specification. I didn't find any error conditions without codes that should have them. Thoughts/Opinions?

Error Codes Used but Not in DI Spec

Error codes used: PROOF_TRANSFORMATION_ERROR, INVALID_PROOF_CONFIGURATION, INVALID_PROOF_DATETIME, MALFORMED_PROOF_ERROR.

  • line 635: "If options.type is not set to the string DataIntegrityProof and options.cryptosuite is not set to the string eddsa-rdfc-2022 then a PROOF_TRANSFORMATION_ERROR MUST be raised." In section Transformation (eddsa-rdfc-2022).
  • line 724: "If options.type is not set to DataIntegrityProof and proofConfig.cryptosuite is not set to eddsa-rdfc-2022, an INVALID_PROOF_CONFIGURATION error MUST be raised."
  • line 730: "If the value is not a valid [[XMLSCHEMA11-2]] datetime, an INVALID_PROOF_DATETIME error MUST be raised."
  • line 1005: "If options.type is not set to the string DataIntegrityProof and options.cryptosuite is not set to the string eddsa-jcs-2022 then an error MUST be raised that SHOULD use the MALFORMED_PROOF_ERROR error code."
  • line 1085: "If options.type is not set to DataIntegrityProof and proofConfig.cryptosuite is not set to eddsa-jcs-2022, an INVALID_PROOF_CONFIGURATION error MUST be raised."
  • line 1090: "Set proofConfig.created to options.created. If the value is not a valid [[XMLSCHEMA11-2]] datetime, an INVALID_PROOF_DATETIME error MUST be raised."
  • line 2017: "If options.type is not set to the string Ed25519Signature2020, then a PROOF_TRANSFORMATION_ERROR MUST be raised."
  • line 2114: "If options.type is not set to Ed25519Signature2020, an INVALID_PROOF_CONFIGURATION error MUST be raised."
  • line 2119: "If the value is not a valid [[XMLSCHEMA11-2]] datetime, an INVALID_PROOF_DATETIME error MUST be raised."
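A hypothetical sketch of what unified, code-carrying error handling could look like in an implementation; the class and function names here are illustrative, not from the specification:

```python
class DataIntegrityError(Exception):
    """Hypothetical error type carrying a standardized DI-style error code."""
    def __init__(self, code: str, message: str):
        super().__init__(f"{code}: {message}")
        self.code = code

def check_proof_options(options: dict) -> None:
    # Mirrors the spec language quoted above: an error MUST be raised and
    # SHOULD convey a standardized error code.
    if (options.get("type") != "DataIntegrityProof"
            or options.get("cryptosuite") != "eddsa-rdfc-2022"):
        raise DataIntegrityError(
            "PROOF_TRANSFORMATION_ERROR",
            "unexpected proof type or cryptosuite",
        )
```

Carrying the code as a structured attribute, rather than only in the message text, makes it straightforward for test suites to assert that the right standardized code was raised.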

Test Vectors Don't Verify

When trying out the procedures of EdDSA Cryptosuite v2020, the test vectors would not verify. I also ran into some inconsistencies with the test vectors.

  • The private key "privateKeyBase58": "47QbyJEDqmHTzsdg8xzqXD8gqKuLufYRrKWTmB7eAaWHG2EAsQ2GUyqRqWWYT15dGuag52Sf3j4hs2mu7w52mgps" is a 64-byte quantity, not a 32-byte quantity as needed for Ed25519.
  • The "seed_0": "9b937b81322d816cfab9d5a3baacc9b2a5febe4b149f126b3630f93a29527017" quantity corresponds to the private key for the public key given by "publicKeyBase58": "dbDmZLTWuEYYZNHFLKLoRkEX4sZykkSLNQLXvMUyMB1", so I used that in subsequent signature attempts.
  • Used the https://www.npmjs.com/package/jsonld library to canonize the example VC.
  • Tried both a SHA-256 prehash and feeding the input directly into the EdDSA signature algorithm (RFC8032 compliant), but was not able to obtain the proof value given as: "proofValue": "z5SpZtDGGz5a89PJbQT2sgbRUiyyAGhhgjcf86aJHfYcfvPjxn6vej5na6kUzmw1jMAR9PJU9mowshQFFdGmDN14D".

What may be the issue?

Best Regards
Greg B.

Consider making verification method id to be multicodec (instead of just JWK thumbprint)

Currently:

The id of the verification method SHOULD be the JWK thumbprint calculated from the publicKeyMultibase property value according to [MULTIBASE].

It would be great if this SHOULD could be changed to be backwards compatible with how did:key method key IDs are formed (using a combination of multibase and multicodec); this would also match the common key ID strategy of various ed25519-2018 suite implementations.

So, specifically:

MULTIBASE(base-type, MULTICODEC(public-key-type, raw-public-key-bytes))

This would require adding an entry to the multicodec table for "JWK Thumbprint".

That way, the current "JWK thumbprint" approach would be encoded as:

MULTIBASE(<base-type - base64url etc>, MULTICODEC(<jwk thumbprint codec type>, <thumbprint bytes>))

But also would enable did:key-like

MULTIBASE(base58-btc, MULTICODEC(public-key-type, raw-public-key-bytes))

@OR13 - would you be open to this approach? (Provided we can add 'JWK Thumbprint' as a multicodec entry?)
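The proposed construction can be sketched in pure Python. The prefix bytes 0xed 0x01 (the varint encoding of the multicodec ed25519-pub code) and the base58-btc multibase prefix "z" are taken from the published multicodec and multibase tables; the key bytes below are placeholders:

```python
# Base58btc alphabet (no 0, O, I, l).
ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58btc_encode(data: bytes) -> str:
    n = int.from_bytes(data, "big")
    out = ""
    while n:
        n, r = divmod(n, 58)
        out = ALPHABET[r] + out
    # One leading '1' per leading zero byte.
    return "1" * (len(data) - len(data.lstrip(b"\x00"))) + out

ED25519_PUB_MULTICODEC = b"\xed\x01"  # varint encoding of code 0xed

def multikey(raw_public_key: bytes) -> str:
    # MULTIBASE(base58-btc, MULTICODEC(ed25519-pub, raw-public-key-bytes))
    return "z" + base58btc_encode(ED25519_PUB_MULTICODEC + raw_public_key)

key_id = multikey(bytes(range(32)))  # placeholder 32-byte public key
```

All 32-byte Ed25519 public keys encoded this way start with "z6Mk", which is the familiar did:key prefix the proposal wants to remain compatible with.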

Test vector issue in B.1 Representation: eddsa-rdfc-2022

The Canonical Proof Options Document is based on v1 of the https://www.w3.org/ns/credentials/v2 context given in the Proof Options Document. When canonicalizing using v2, there is a single difference in the cryptosuite line.

When using v1

_:c14n0 <https://w3id.org/security#cryptosuite> "eddsa-rdfc-2022" .

When using v2

_:c14n0 <https://w3id.org/security#cryptosuite> "eddsa-rdfc-2022"^^<https://w3id.org/security#cryptosuiteString> .

This has the effect of invalidating the test vector, as the hash is not equivalent using the provided test document. The change results from the expanded cryptosuite entry added to v2 on August 31.

The test vector hash 2be46f84ea6a1f078d128fa0f234b4abb76d1ae1efac14d0e62557163a871621 can be reproduced by replacing the v2 context with v1's archived URL: https://raw.githubusercontent.com/w3c/vc-data-integrity/54291182d0452517aeda9ceb114d46c4e1ddd364/contexts/data-integrity/v1.

Context (JSON-LD) for Examples and Test Vectors

Example Context Usage Questions

Hence can we simplify the above to just use the "https://www.w3.org/ns/credentials/v2" reference in the @context? If not can someone more expert in JSON-LD contexts write some non-normative text to aid developers trying to implement this specification.

Context for Test Vectors

The PR #29 has been held up for a bit due to differences on what should be in the @context field. Although this PR concerns an informative section, Test Vectors, a reasonably complete set of test vectors are important to the progress of any specification involving cryptography.

The example for the unsecured document used for the generation of all test vectors is shown below. If we can get agreement on how it should be modified to satisfy all interested parties I can quickly regenerate all the test vector examples and update the PR. Developers crave up to date test vectors to validate their implementations! Please help me progress this specification. Thanks!

{
    "@context": [
        "https://www.w3.org/ns/credentials/v2",
        "https://www.w3.org/ns/credentials/examples/v2"
    ],
    "id": "urn:uuid:58172aac-d8ba-11ed-83dd-0b3aef56cc33",
    "type": ["VerifiableCredential", "AlumniCredential"],
    "name": "Alumni Credential",
    "description": "A minimum viable example of an Alumni Credential.",
    "issuer": "https://vc.example/issuers/5678",
    "validFrom": "2023-01-01T00:00:00Z",
    "credentialSubject": {
        "id": "did:example:abcdefgh",
        "alumniOf": "The School of Examples"
    }
}

Review on 2023-01-26 by Greg Bernstein, Test Vector Suggestions

Review

  • Reference to Multiformats is out of date. New reference: https://datatracker.ietf.org/doc/html/draft-multiformats-multibase-06
  • On example 1, "An Ed25519 public key encoded as a Multikey": should this example use the context field as shown in section 2.3.1.2 of the Data Integrity specification? I.e., add "@context": ["https://w3id.org/security/multikey/v1"]?
  • In example 2 "An Ed25519 public key encoded as a Multikey in a controller document" do we also need to add the additional item "https://w3id.org/security/multikey/v1" to the @context array?
  • Section 2.2.1 DataIntegrityProof. It says: "The proofValue property of the proof MUST be a detached EdDSA produced according to [RFC8032], encoded according to [MULTIBASE] using the base58-btc base encoding." Doesn't this need to be produced according to the "Algorithms" section (which includes RFC8032 as part of the procedure)? Editorially, I would revise to: "The proofValue property of the proof MUST be a detached EdDSA produced according to section 3 Algorithms."
  • Question for clarification. Are DataIntegrityProof and Ed25519Signature2020 two containers for the same information the former being more general? Is one to be preferred? Should something be said about this in the text?
  • Section 3.2 Hashing. Step 3: "Let hashData be the result of concatenating transformedDocumentHash and proofConfigHash." Using VCs generated by the CHAPI Playground or the Verifiable Credentials JS Library (Example: Verifying Verifiable Credentials), I verified sample credentials with the order of these two hashes reversed, i.e., in Python, test_combined_hash = proof_hash + doc_hash. Am I misinterpreting the text, or is this an error?
  • The current test vectors don't seem to verify and could use more elaboration. It would be nice to illustrate all the steps: private and public keys, raw document, options, canonized versions of both, hashes of both, raw signature, encoded signature.
  • I've furnished an example set of test vectors below with proof type "Ed25519Signature2020".
  • Test vectors: do we want two different example chains? One corresponding to DataIntegrityProof and one to Ed25519Signature2020, i.e., to produce something like examples 5 and 6 as output.

Test Vector Suggestion

Key Information

Console logged the keypair info from the VC ReSpec plugin...

{
    publicKeyMultibase: "z6MkrJVnaZkeFzdQyMZu1cgjg7k1pZZ6pvBQ7XJPt4swbTQ2", 
    privateKeyMultibase: "zrv4fUrY27wFmySt7kSQ1yUqsobTkN8uPvAH1WB4sCJ4d7Q4yDpJNN3AVxQZybuM2txbXbWYCRDKZxenLmGz32Tp5bt" 
}

Notes: The private key above is actually two keys; we only want the first 32 bytes. We can show these in Multibase or hex format.

Credential with Proof

{
  "@context": [
    "https://www.w3.org/2018/credentials/v1",
    "https://www.w3.org/2018/credentials/examples/v1",
    "https://w3id.org/security/suites/ed25519-2020/v1"
  ],
  "id": "http://example.edu/credentials/3732",
  "type": [
    "VerifiableCredential",
    "UniversityDegreeCredential"
  ],
  "issuer": "https://example.edu/issuers/565049",
  "issuanceDate": "2010-01-01T00:00:00Z",
  "credentialSubject": {
    "id": "did:example:ebfeb1f712ebc6f1c276e12ec21",
    "degree": {
      "type": "BachelorDegree",
      "name": "Bachelor of Science and Arts"
    }
  },
  "proof": {
    "type": "Ed25519Signature2020",
    "created": "2022-12-07T21:31:08Z",
    "verificationMethod": "https://example.edu/issuers/565049#key-1",
    "proofPurpose": "assertionMethod",
    "proofValue": "z2RczMj342tVhAjgjEPV4TeHbi2ggnTRKTc5BFQCgaWJ3nhcg5HgCeC2eV4Lc1fYdhfoLyPjxoq4BtqrsyNvxZ8nE"
  }
}

Credential with Proof Removed

{
  "@context": [
    "https://www.w3.org/2018/credentials/v1",
    "https://www.w3.org/2018/credentials/examples/v1",
    "https://w3id.org/security/suites/ed25519-2020/v1"
  ],
  "id": "http://example.edu/credentials/3732",
  "type": [
    "VerifiableCredential",
    "UniversityDegreeCredential"
  ],
  "issuer": "https://example.edu/issuers/565049",
  "issuanceDate": "2010-01-01T00:00:00Z",
  "credentialSubject": {
    "id": "did:example:ebfeb1f712ebc6f1c276e12ec21",
    "degree": {
      "type": "BachelorDegree",
      "name": "Bachelor of Science and Arts"
    }
  }
}

Canonized Document without Proof

<did:example:ebfeb1f712ebc6f1c276e12ec21> <https://example.org/examples#degree> _:c14n0 .
<http://example.edu/credentials/3732> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://example.org/examples#UniversityDegreeCredential> .
<http://example.edu/credentials/3732> <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://www.w3.org/2018/credentials#VerifiableCredential> .
<http://example.edu/credentials/3732> <https://www.w3.org/2018/credentials#credentialSubject> <did:example:ebfeb1f712ebc6f1c276e12ec21> .
<http://example.edu/credentials/3732> <https://www.w3.org/2018/credentials#issuanceDate> "2010-01-01T00:00:00Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> .
<http://example.edu/credentials/3732> <https://www.w3.org/2018/credentials#issuer> <https://example.edu/issuers/565049> .
_:c14n0 <http://schema.org/name> "Bachelor of Science and Arts"^^<http://www.w3.org/1999/02/22-rdf-syntax-ns#HTML> .
_:c14n0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://example.org/examples#BachelorDegree> .

Hash of Canonized VC w/o Proof

As a hexadecimal string:

6c6b2795e7fa33a9fb28062527142b3c6edf7ba239942439b6f0bb0851b3cce3

Proof Options Document

{
  "type": "Ed25519Signature2020",
  "created": "2022-12-07T21:31:08Z",
  "verificationMethod": "https://example.edu/issuers/565049#key-1",
  "proofPurpose": "assertionMethod",
  "@context": [
    "https://www.w3.org/2018/credentials/v1",
    "https://www.w3.org/2018/credentials/examples/v1",
    "https://w3id.org/security/suites/ed25519-2020/v1"
  ]
}

Canonized Proof Options

_:c14n0 <http://purl.org/dc/terms/created> "2022-12-07T21:31:08Z"^^<http://www.w3.org/2001/XMLSchema#dateTime> .
_:c14n0 <http://www.w3.org/1999/02/22-rdf-syntax-ns#type> <https://w3id.org/security#Ed25519Signature2020> .
_:c14n0 <https://w3id.org/security#proofPurpose> <https://w3id.org/security#assertionMethod> .
_:c14n0 <https://w3id.org/security#verificationMethod> <https://example.edu/issuers/565049#key-1> .

Hash of Canonized Proof Options

As a hexadecimal string:

565a2884ebb2d38aa34871108074ab51631ec812d33eb2473178bce19937ad09

Concatenate and Sign with Private Key.

Concatenation of proof_hash then raw doc_hash (in this order):

565a2884ebb2d38aa34871108074ab51631ec812d33eb2473178bce19937ad096c6b2795e7fa33a9fb28062527142b3c6edf7ba239942439b6f0bb0851b3cce3
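The concatenation above can be checked mechanically; this sketch simply reassembles the two hashes already shown (proof options hash first, then document hash):

```python
# Hash of the canonized proof options (from the section above).
proof_hash_hex = "565a2884ebb2d38aa34871108074ab51631ec812d33eb2473178bce19937ad09"
# Hash of the canonized VC without proof (from the section above).
doc_hash_hex = "6c6b2795e7fa33a9fb28062527142b3c6edf7ba239942439b6f0bb0851b3cce3"

# proof_hash then doc_hash, in this order; this 64-byte value is signed.
combined = bytes.fromhex(proof_hash_hex) + bytes.fromhex(doc_hash_hex)
```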

Signature in hexadecimal:

'473fb02a4aaf5863a2ef33f104bd55617e40907bc311e29e87278d15d7596f201639f41ec0e00db11159e9139f673d9257558e1f0134e1f67ac73f91ed89670b'

Signature Base58btc encoded:

'z2RczMj342tVhAjgjEPV4TeHbi2ggnTRKTc5BFQCgaWJ3nhcg5HgCeC2eV4Lc1fYdhfoLyPjxoq4BtqrsyNvxZ8nE'

Necessary adjustments to the security vocabulary

If my interpretation in #32 is correct, there are some adjustments to be made to the data integrity vocabulary document, namely:

  • A new class Multikey should be added, presumably as a subclass of VerificationMethod
  • A new property privateKeyMultibase should be added, as a counterpart to the (already existing) publicKeyMultibase property
  • The Ed25519Signature2020 class should be marked as deprecated (note: the Ed25519VerificationKey2020 class was already set to deprecated before, but that was artificial and simply reflected the fact that the class is not even mentioned in the Data Integrity spec)

Obviously, I am happy to create a relevant PR, but I wanted to check this first.

Is Example 15 / 16 correct?

I just tried to implement Ed25519Signature2020 in Python, and hit the following problem while trying to reproduce https://w3c.github.io/vc-di-eddsa/#representation-ed25519signature2020

Everything works fine until I sign the combined hash. Then I get a different result:

expectation = "473fb02a4aaf5863a2ef33f104bd55617e40907bc311e29e87278d15d7596f201639f41ec0e00db11159e9139f673d9257558e1f0134e1f67ac73f91ed89670b"
my_result = "9da48c66efed97e18281786cdd43e3fd245abbbdfebc9f9ab4cf0e974bb9a80bc3166ec9cf5f2a8e2f1c78a53682f9b7649f32285124b1ade5bd46b667b7620e"

I decode the private key via:

from multiformats import multibase, multicodec
from cryptography.hazmat.primitives.asymmetric import ed25519

decoded = multibase.decode("z3u2en7t5LR2WtQH5PfFqMqwVHBeXouLzo6haApm8XHqvjxq")
codec, key_bytes = multicodec.unwrap(decoded)
assert codec.name == "ed25519-priv"
private_key = ed25519.Ed25519PrivateKey.from_private_bytes(key_bytes)

and sign the combined hash via

signed_hex = private_key.sign(combined_hash.encode("ascii")).hex()

The base58btc encoding also turns out wrong.

Does someone spot where my mistake is? Or can someone quickly confirm that examples 15/16 are correct?

"signature" should be "proofValue" ?

In this example (and others):

    "proof": {
      "type": "Ed25519Signature2020",
      "created": "2019-12-11T03:50:55Z",
      "verificationMethod": "https://example.com/issuer/123#key-0",
      "proofPurpose": "authentication",
      "challenge": "123",
      "signature": "z5LgJQhEvrLoNqXSbBzFR6mqmBnUefxX6dBjn2A4FYmmtB3EcWC41RmvHARgHwZyuMkR9xMbMCY7Ch4iRr9R8o1JffWY63FRfX3em8f3avb1CU6FaxiMjZdNegc"
    }

Shouldn't it be using proofValue instead of signature ?

eddsa-jcs-2022 and nested documents

Should any additional steps be performed when nested documents are being secured?

Example:

{
  "@context": [
    "https://www.w3.org/ns/activitystreams",
    "https://w3id.org/security/data-integrity/v2"
  ],
  "type": "Create",
  "object": {
    "@context": [
      "https://www.w3.org/ns/activitystreams",
      "https://w3id.org/security/data-integrity/v2"
    ],
    "type": "Note",
    "content": "test",
    "proof": {
      "@context": [
        "https://www.w3.org/ns/activitystreams",
        "https://w3id.org/security/data-integrity/v2"
      ],
      "type": "DataIntegrityProof",
      "cryptosuite": "eddsa-jcs-2022"
    }
  },
  "proof": {
    "@context": [
      "https://www.w3.org/ns/activitystreams",
      "https://w3id.org/security/data-integrity/v2"
    ],
    "type": "DataIntegrityProof",
    "cryptosuite": "eddsa-jcs-2022"
  }
}

The document with type Note is secured first, then it is inserted into the document with type Create under the object property, and finally the Create document is secured as well. Here, I'm assuming that #79 will be merged. Some properties are omitted for brevity.

Is the resulting document valid?
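For what it's worth, the ordering described above can be sketched as follows. This is a deliberate simplification and an assumption on my part: `json.dumps` with sorted keys stands in for full JCS canonicalization (RFC 8785 has stricter number and string rules), and a bare SHA-256 digest stands in for the real Ed25519 proof computation.

```python
import hashlib
import json

def canonicalize(doc):
    # Approximates JCS for plain-ASCII documents; real JCS (RFC 8785)
    # differs in number and string serialization details.
    return json.dumps(doc, sort_keys=True, separators=(",", ":")).encode()

def secure(doc):
    # Stand-in for the eddsa-jcs-2022 proof algorithm: a real implementation
    # canonicalizes, hashes, and signs with Ed25519. Here a SHA-256 digest
    # plays the role of the proof value, purely for illustration.
    digest = hashlib.sha256(canonicalize(doc)).hexdigest()
    secured = dict(doc)
    secured["proof"] = {
        "type": "DataIntegrityProof",
        "cryptosuite": "eddsa-jcs-2022",
        "proofValue": digest,
    }
    return secured

# Secure the inner Note first, embed it, then secure the outer Create.
note = secure({"type": "Note", "content": "test"})
create = secure({"type": "Create", "object": note})
# The outer proof is computed over the embedded Note *including* its proof,
# so verifying the outer proof also pins the inner one.
```

Under this reading, no additional steps seem necessary for nesting: the outer proof simply covers the already-secured inner document as ordinary document content.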

Algorithm diffs vs. full listings

There is concern about the way many of the algorithms in the specification are currently written. The following pieces exist:

  1. Base algorithms in the vc-data-integrity spec.
  2. cryptosuite-specific modifications to those algorithms in the cryptosuite specs.
  3. When two cryptosuites' algorithms overlap heavily, a diff between the base algorithm and the one that reuses it.

As @TallTed and @iherman have noted:

... this is sub-par for an implementer that is trying to piece together the algorithms when implementing a specific cryptosuite.

Remove the outdated portions...

This is the first time I have looked at this document, and it made me realize (unless I am missing something) that half of it is outdated. The way I understand this document is that

  • in §2:
    • the Ed25519VerificationKey2020 is meant to be replaced by Multikey
    • the Ed25519Signature2020 proof format is meant to be replaced by DataIntegrityProof plus an extra value of "cryptosuite":"eddsa-2022"
  • in §3:
    • Ed25519Signature2020 meant to be replaced by eddsa-2022

In other words, editorially, the sections 2.1.2, 2.2.2, and 3.2 should be removed (and probably a new informative section should be added with some history, referring to those sections as deprecated).

This is, at least, how I read the text: these sections are mostly word-for-word identical, except for the appearance and usage of the cryptosuite property.

If I am right, I would also suggest that the title of the document is a misnomer: it should either refer to 2022 (not 2020) or drop the year reference altogether.

If all this is true, I believe it should be done ASAP. At this moment, for any reader who has not been part of the original development, this document is extremely confusing and unclear; we should definitely not publish it in its present form as a FPWD.
