To start developing and contributing to HoneyBadgerMPC:
Fork the HoneyBadgerMPC repository.
Clone your fork:
$ git clone --branch dev git@github.com:<username>/HoneyBadgerMPC.git
Add the remote repository initc3/HoneyBadgerMPC:
$ git remote add upstream git@github.com:initc3/HoneyBadgerMPC.git
Note
The remote name ``upstream`` is just a convention and you are free to name your remotes whatever you like.
See :ref:`git-remotes` for more information about remotes.
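As an optional quick check, you can list your remotes to confirm both are configured (the exact URLs will depend on your fork and on whether you cloned over SSH or HTTPS):

$ git remote -v
origin    git@github.com:<username>/HoneyBadgerMPC.git (fetch)
origin    git@github.com:<username>/HoneyBadgerMPC.git (push)
upstream  git@github.com:initc3/HoneyBadgerMPC.git (fetch)
upstream  git@github.com:initc3/HoneyBadgerMPC.git (push)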
Install pre-commit and set up its git hook so that the code is automatically formatted with black before each commit:

$ pip3 install --user pre-commit
$ pre-commit install
pre-commit installed at .git/hooks/pre-commit
See https://pre-commit.com/#install for other ways to install pre-commit.
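Optionally, you can run the hooks once against all files to check that the existing code is already formatted as expected:

$ pre-commit run --all-files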
Next step: :ref:`set up a development environment <devenv>`.
You are free to manage your development environment the way you prefer. Two possible approaches are documented: using docker-compose, and installing locally with pipenv (covered further below). Using docker-compose has the advantage that you do not need to manage dependencies, as everything is taken care of in the Dockerfile.
You are encouraged to consult the Your Development Environment section in the The Hitchhiker’s Guide to Python for tips and tricks about text editors, IDEs, and interpreter tools.
Install Docker. (For Linux, see Manage Docker as a non-root user to run docker without sudo.)
Install docker-compose.
Run the tests (the first time will take longer as the image will be built):
$ docker-compose run --rm honeybadgermpc
The tests should pass, and you should also see a small code coverage report output to the terminal.
If all of the above went well, you should be set up for developing HoneyBadgerMPC!
Tip
You may find it useful when developing to have the following three "windows" open at all times:

- your text editor or IDE
- an ipython session for quickly trying things out
- a shell session for running tests, debugging, and building the docs

You can run the ipython and shell sessions in separate containers:
IPython session:
$ docker-compose run --rm honeybadgermpc ipython
Shell session:
$ docker-compose run --rm honeybadgermpc bash
Once in the session (container) you can execute commands just as you would in a non-container session.
Running a specific test in a container (shell session)
As an example, to run the tests for passive.py, which will generate and open 1000 zero-sharings, N=3 t=2 (so no fault tolerance):
Run a shell session in a container:
$ docker-compose run --rm honeybadgermpc bash
Run the test:
$ pytest -vs tests/test_mpc.py
or
$ python -m honeybadgermpc.mpc
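Alternatively, you can pass the command directly to docker-compose run without opening an interactive shell session first (this assumes, as in the examples above, that the container's working directory is the repository root):

$ docker-compose run --rm honeybadgermpc pytest -vs tests/test_mpc.py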
About code changes and building the image
When developing, you should not need to rebuild the image nor exit running containers, unless new dependencies were added via the Dockerfile. Hence, you can modify the code, add breakpoints, and add new Python modules (files), and the modifications will be readily available within the running containers.
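If you do add or change dependencies via the Dockerfile, rebuild the image before starting new containers:

$ docker-compose build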
Install the GMP, MPC and MPFR development packages:
.. tabs::

   .. tab:: Debian

      .. code-block:: shell-session

         $ apt install libgmp-dev libmpc-dev libmpfr-dev

   .. tab:: Fedora

      .. code-block:: shell-session

         $ dnf install gmp-devel libmpc-devel mpfr-devel

   .. tab:: Mac OS X

      .. code-block:: shell-session

         $ brew install gmp libmpc mpfr

   .. tab:: Windows

      Should not be needed as `pre-compiled versions
      <https://pypi.org/project/gmpy2/#files>`_ of ``gmpy2`` are available
      on PyPI. See `gmpy2 docs for Windows`_ for more information.
Install honeybadgermpc in editable mode for development:

$ cd HoneyBadgerMPC/
$ pipenv install -e .[dev]
Activate a virtualenv:
$ pipenv shell
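As an optional sanity check that the gmpy2 dependency was built correctly against the system libraries installed above, you can try importing it (the printed version will vary):

$ python -c 'import gmpy2; print(gmpy2.version())'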
Run the tests to check that you are set up correctly:
$ pytest -v --cov
The tests should pass, and you should also see a small code coverage report output to the terminal.
The tests for honeybadgermpc are located under the :file:`tests/` directory and can be run with pytest:
$ pytest
Running in verbose mode:
$ pytest -v
Running a specific test:
$ pytest -v tests/test_mpc.py::test_open_shares
When debugging, i.e. if you have put breakpoints in the code, use the -s option (or its equivalent --capture=no):
$ pytest -v -s
# or
$ pytest -v --capture=no
To exit instantly on first error or failed test:
$ pytest -x
To re-run only the tests that failed in the last run:
$ pytest --lf
See pytest --help or the pytest docs for more options.
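For instance, pytest can select tests by a keyword expression with the -k option; to run only the tests whose names match a substring (the substring here is just an example):

$ pytest -v -k open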
Measuring the code coverage:
$ pytest --cov
Generating an html coverage report:
$ pytest --cov --cov-report html
View the report:
$ firefox htmlcov/index.html
Configuration for code coverage is located under the file :file:`.coveragerc`.
Code coverage tools
The code coverage is measured using the pytest-cov plugin, which is based on coverage.py. The documentation of both projects is important when working on code coverage related issues. As an example, documentation for configuration can first be found in the pytest-cov configuration docs, but details about the coverage config file need to be looked up in the coverage.py configuration docs.
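For example, pytest-cov exposes a --cov-config option that points coverage.py at a specific config file; passing the repository's :file:`.coveragerc` explicitly is usually redundant, since it is the default, but makes the relationship between the two tools visible:

$ pytest --cov --cov-config=.coveragerc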
In order to maintain a minimal level of "code quality", flake8 is used. To run the check:
$ flake8
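To also get a total error count and a per-code summary of the warnings, you can add the --count and --statistics flags (both standard flake8 options):

$ flake8 --count --statistics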
Configuration for flake8 is under the :file:`.flake8` file.
Documentation for honeybadgermpc is located under the :file:`docs/` directory. Sphinx is used to build the documentation, which is written using the markup language reStructuredText.
The :file:`Makefile` can be used to build and serve the docs.
The make targets to build the documentation do not update or rebuild the docker image (honeybadgermpc-local) being used, so make sure you have an up-to-date image.
To check whether the honeybadgermpc-local image was recently created:
$ docker images honeybadgermpc-local
REPOSITORY TAG IMAGE ID CREATED SIZE
honeybadgermpc-local latest 628fdc4f0200 18 minutes ago 2.58GB
To (re)build it:
$ docker-compose build
To build and serve the docs:

$ make servedocs
This will build the docs and open a tab or window in your default web browser at http://localhost:58888/.
When you make and save changes to .rst files, the documentation will be rebuilt automatically. You should see the output in the terminal where you ran make servedocs.
Note
The automatic documentation generation uses watchdog. You can look at the docs.yml file to understand better how it works.
If you prefer you can run the automatic documentation generation in the background with:
$ make servedocs-detach
To monitor the output of the documentation generation you can follow the logs like so:
$ make docs-follow-logs
To simply get a dump of the latest logs:
$ make docs-logs
To stop serving and watching the docs:
$ make servedocs-stop
You can also build and serve the docs without the browser being launched automatically:

$ make docs
You then have to go to http://localhost:58888/ in a web browser.
To build the docs and have the browser automatically launch at http://localhost:58888/ run:
$ make docs-browser
There are many other ways to generate the documentation. The Makefile targets and docker-compose docs.yml file are provided for convenience.
If you prefer not to use the Makefile and/or the docker-compose docs.yml file, then you can use the :file:`Makefile`, provided by Sphinx, under the :file:`docs/` directory:
$ make -C docs html
or
$ cd docs
$ make html
The :file:`Makefile` makes use of the sphinx-build command, which one can also use directly:
$ sphinx-build -M html docs docs/_build -c docs -W --keep-going
It is possible to set some Sphinx environment variables when using the :file:`Makefile`, and more particularly SPHINXOPTS via the shortcut O.
For instance, to treat warnings as errors and to keep going with
building the docs when a warning occurs:
$ O='-W --keep-going' make html
By default the generated docs are under :file:`docs/_build/html/` and one can view them using a browser, e.g.:
$ firefox docs/_build/html/index.html