Repository for the SoK paper on Fully Homomorphic Encryption (FHE) compilers.
Both sets of plots should be generated by the Github Runner.
See https://github.com/homenc/HElib
It seems like they already provide a Docker image.
Benchmarking the libraries against each other:
We should fix two plaintext space sizes, e.g. 1-bit (binary) and 64-bit and use that for all benchmarks, just so that we don't blow up the space of possibilities too much.
For levelled schemes, we should evaluate with 1-mult params and with e.g. 10 levels.
For binary-only libraries like TFHE, the 64-bit version then uses an adder/multiplier circuit. In that case, ctxt-ptxt optimizations are possible but not implemented, which is fine, since this isn't the core of the paper anyway.
For fair comparison among implementations of the same scheme, we should hardcode the parameters to be equal everywhere.
Finally, we could consider implementing the benchmark applications (NN, cardio, Chi-Squared) for each library, but this is out of scope for the current timeframe.
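The fixed benchmark matrix sketched above can be written down as a small configuration. The library list, plaintext sizes, and variant names below are illustrative assumptions for the sketch, not the repo's actual settings:

```python
import itertools

# Illustrative benchmark matrix; values are assumptions, not the repo's settings.
PLAINTEXT_BITS = [1, 64]                  # binary and 64-bit plaintext spaces
PARAM_VARIANTS = ["1-mult", "10-levels"]  # levelled-scheme parameter sets
LIBRARIES = ["SEAL", "HElib", "TFHE"]     # hypothetical library list

def benchmark_configs():
    """Enumerate all (library, plaintext bits, parameter variant) combinations."""
    return list(itertools.product(LIBRARIES, PLAINTEXT_BITS, PARAM_VARIANTS))

configs = benchmark_configs()
print(len(configs))  # 3 libraries * 2 sizes * 2 variants = 12
```

Pinning the matrix down like this makes it easy to see how quickly the space of possibilities grows when adding another dimension.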
Add a metadata file (text file) that describes the environment and settings used for the benchmark run.
Some of this information can only be gathered inside the container; for it, we need a script for each combination of (benchmark program, tool).
print_parameters(...) in examples.h
This is information that is not available in the docker container but must be gathered on the EC2 VM.
The script collecting this information is usable for all benchmark programs.
git rev-parse HEAD
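A minimal sketch of such a VM-side metadata collector, assuming Python is available on the EC2 instance; the field names are made up for illustration:

```python
import datetime
import platform
import subprocess

def collect_metadata():
    """Gather basic environment info for a benchmark run (illustrative fields)."""
    try:
        commit = subprocess.check_output(
            ["git", "rev-parse", "HEAD"], text=True).strip()
    except (subprocess.CalledProcessError, FileNotFoundError):
        commit = "n/a"  # not inside a git checkout
    return {
        "date": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "platform": platform.platform(),
        "machine": platform.machine(),
        "commit": commit,
    }

for key, value in collect_metadata().items():
    print(f"{key}: {value}")
```

Writing the dict out as a plain text file next to the benchmark results would cover the metadata-file requirement above.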
@AlexanderViand What do you think about adding the git commit SHA to the S3 folder name? Would this break something related to the visualization website?
Note to myself: this would require the following change to L43 of benchmark.yml:
$(echo $(date +'%Y%m%d_%H%M%S')_$(git rev-parse --short "$GITHUB_SHA"))
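For reference, the resulting folder name can be reproduced in Python for testing; the 7-character short SHA mirrors the usual default of git rev-parse --short, and the function name is made up:

```python
import datetime

def s3_folder_name(github_sha, now=None):
    """Build '<YYYYMMDD_HHMMSS>_<short-sha>' as sketched in benchmark.yml."""
    now = now or datetime.datetime.now()
    return f"{now.strftime('%Y%m%d_%H%M%S')}_{github_sha[:7]}"

print(s3_folder_name("a1b2c3d4e5f6a7b8c9d0e1f2a3b4c5d6e7f8a9b0",
                     datetime.datetime(2020, 11, 24, 12, 0, 0)))
# 20201124_120000_a1b2c3d
```

Whether this breaks the visualization website depends on whether it parses the folder name as a pure timestamp or just sorts lexicographically.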
As with the other tools, we need a small demo program that shows what writing a program for TFHE looks like and how it can be compiled.
Look into the last GA benchmark run and check why plotting failed, see the log here: https://github.com/MarbleHE/SoK/runs/1479985694?check_suite_focus=true
See the reported issue. If no response is received by July 30, send a mail to Fabian Boemer ([email protected]). Put Rosario Cammarota ([email protected]) and Casimir Wierzynski ([email protected]) in CC?
Mail snippets:
We are currently working on a survey of FHE compilers & optimisation techniques for an SoK-style paper on the topic ...
We're planning to benchmark and compare as many tools & techniques as possible, including some apples-to-oranges comparison between tools with very different focuses. We want to explore what types of applications benefit most from which kinds of optimisations, aiding developers in selecting tools based on their application needs, and maybe identify opportunities for synergies between tools.
We would love to include in our evaluation, and .... *however,
<we would be delighted to hear ...>
Add Palisade docker environment
https://gitlab.com/palisade/palisade-release
(note that the release is actually more current (10.0.5) than the development repo (10.0.4) as of 2020-11-24)
Some aspects for improving the current SoK benchmarking setup:
- Adjust the timeoutSeconds argument to match the timeout configured in the workflow.
- Add --output-s3-bucket-name <value> and --output-s3-key-prefix <value> to write SSM output into an S3 bucket (see https://docs.aws.amazon.com/cli/latest/reference/ssm/send-command.html) for easier debugging.

Should be based on the SEAL environment, since EVA is based on that.
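The SSM flags mentioned above map to boto3 keyword arguments. As a sketch, the helper below only builds the kwargs dict (instance ID, bucket, and script names are placeholders), so it stays runnable without AWS credentials; in the real workflow the dict would be passed to boto3.client("ssm").send_command(**kwargs):

```python
def ssm_send_command_kwargs(instance_id, commands, timeout_seconds,
                            bucket, key_prefix):
    """Build kwargs for SSM send_command; values here are placeholders."""
    return {
        "InstanceIds": [instance_id],
        "DocumentName": "AWS-RunShellScript",
        "Parameters": {"commands": commands},
        "TimeoutSeconds": timeout_seconds,
        "OutputS3BucketName": bucket,     # --output-s3-bucket-name
        "OutputS3KeyPrefix": key_prefix,  # --output-s3-key-prefix
    }

kwargs = ssm_send_command_kwargs(
    "i-0123456789abcdef0", ["./run_benchmark.sh"], 3600,
    "sok-repository-eval-benchmarks", "ssm-logs")
print(sorted(kwargs))
```

Keeping the kwargs in one place would also make it easy to assert in CI that the SSM timeout and the workflow timeout agree.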
As the Chi-Squared test was implemented right before the submission deadline, its associated wiki page is not complete yet.
The CSV output schema used by cardio in Cingulata currently looks like:
num_conditions, t_keygen, t_input_encryption, t_computation, t_decryption
15, 0.713642, 9.640915500, 51.604768300, 0.234077
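For reference, this schema can be read with a few lines of Python; the inlined sample row is the example above:

```python
import csv
import io

# Sample taken from the schema above; real runs append one row per benchmark.
SAMPLE = """num_conditions, t_keygen, t_input_encryption, t_computation, t_decryption
15, 0.713642, 9.640915500, 51.604768300, 0.234077
"""

def read_cardio_csv(text):
    """Parse the cardio CSV into dicts with numeric timing columns."""
    reader = csv.DictReader(io.StringIO(text), skipinitialspace=True)
    return [{k: (int(v) if k == "num_conditions" else float(v))
             for k, v in row.items()}
            for row in reader]

rows = read_cardio_csv(SAMPLE)
print(rows[0]["t_computation"])  # 51.6047683
```

skipinitialspace=True handles the spaces after the commas, so the header names come out clean.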
TODO
s3://sok-repository-eval-benchmarks/<timestamp_folder>/<tool_folder>/<toolname_benchmark-program.csv>
where timestamp_folder uses the format YYYYMMDD_HHMMSS, tool_folder is named according to the tool (e.g., Cingulata), and the benchmark's result file is composed of the tool's name and the benchmark program (e.g., cingulata_cardio.csv).
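The naming convention above can be captured in a small helper, which also gives a single place to change if the folder format ever gains a commit SHA; the function name is made up:

```python
import datetime

def result_key(tool, benchmark, now=None):
    """Compose '<timestamp_folder>/<tool_folder>/<toolname_benchmark>.csv'."""
    now = now or datetime.datetime.now()
    timestamp_folder = now.strftime("%Y%m%d_%H%M%S")
    return f"{timestamp_folder}/{tool}/{tool.lower()}_{benchmark}.csv"

print(result_key("Cingulata", "cardio",
                 datetime.datetime(2020, 11, 24, 9, 30, 0)))
# 20201124_093000/Cingulata/cingulata_cardio.csv
```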
The Wiki here: https://github.com/MarbleHE/SoK/wiki
Attempts to embed these files:
https://github.com/MarbleHE/SoK/wiki/docs/tableI.png
https://github.com/MarbleHE/SoK/wiki/docs/tableII.png
Both of these are broken images (HTTP 406).
Create a wiki page for the cardio benchmark that contains:
See also #11
At minimum, do what we do in SEAL (i.e. FC/MLP)
We currently contrast manually-optimized vs compiler-optimized programs. Except for the NN task, this tends to result in the manual versions significantly outperforming the compiled versions, so maybe it'd be interesting to add a non-optimized ("naive") manual implementation?
E.g., cardio-seal could use a ripple-carry adder vs the Sklansky adder.
One more option would be to never use in-place operations, to always relinearize directly after each multiplication (which we might do already anyway), and maybe even to always use ctxt-ctxt operations, to really simulate a "poor" implementation?
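To illustrate the circuit-level difference, here is a plain (unencrypted) ripple-carry adder over bit lists as a sketch; in an FHE setting, every AND/XOR below becomes one homomorphic gate, and the carry chain's linear depth is exactly what a Sklansky adder's logarithmic depth avoids:

```python
def ripple_carry_add(a_bits, b_bits):
    """Add two equal-width little-endian bit lists, returning width+1 bits."""
    assert len(a_bits) == len(b_bits)
    result, carry = [], 0
    for a, b in zip(a_bits, b_bits):
        result.append(a ^ b ^ carry)
        carry = (a & b) | (carry & (a ^ b))  # carry chain: depth grows linearly
    result.append(carry)
    return result

def to_bits(n, width):
    return [(n >> i) & 1 for i in range(width)]

def from_bits(bits):
    return sum(bit << i for i, bit in enumerate(bits))

print(from_bits(ripple_carry_add(to_bits(13, 8), to_bits(29, 8))))  # 42
```

The "naive" variant would use exactly this adder structure, while the optimized variant restructures the carry computation into a shallower tree.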
May I know what SoK stands for?
BR
Add Concrete library docker environment
Implement the cardio benchmark program in SEAL.
As with the other tools, we need a simple demo program and a Dockerfile that serve as a blueprint for implementing our benchmark programs.
Originally posted by @pjattke in #1 (comment)
To implement the NN-Inference application, the following needs to be done:
Some ideas that came up that could be interesting to add to the visualization website:
Add a button that sends a repository_dispatch event for triggering GitHub Actions workflows
As an extension of suggestion one, it would be nice to display the URL of the workflow status page after the button is pressed.
The workflow is very simple and just sends the commands via AWS SSM to the EC2 instances. However, a successful workflow run does not guarantee that all benchmarks were executed successfully. Up to now, the easiest way is to check if the expected files were written to S3.
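That check could be automated with a small helper that, given the tool/benchmark matrix and a listing of S3 keys, reports which result files are missing; in practice the listing would come from boto3's list_objects_v2, but the sketch below takes a plain iterable so it stays testable offline (names are illustrative):

```python
def missing_results(timestamp_folder, tools_and_benchmarks, s3_keys):
    """Return expected result keys absent from the S3 listing.

    Follows the '<timestamp_folder>/<tool>/<tool>_<benchmark>.csv' convention;
    `s3_keys` is any iterable of key strings (e.g. from list_objects_v2).
    """
    expected = {
        f"{timestamp_folder}/{tool}/{tool.lower()}_{bench}.csv"
        for tool, bench in tools_and_benchmarks
    }
    return sorted(expected - set(s3_keys))

listing = ["20201124_093000/Cingulata/cingulata_cardio.csv"]
print(missing_results("20201124_093000",
                      [("Cingulata", "cardio"), ("SEAL", "cardio")],
                      listing))
# ['20201124_093000/SEAL/seal_cardio.csv']
```

Failing the workflow when this list is non-empty would close the gap between "workflow succeeded" and "all benchmarks succeeded".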