This repository has been archived by the owner on Aug 30, 2022. It is now read-only.

XP-407 update documentation #239

Merged (2 commits) on Jan 24, 2020
README.md: 2 changes (1 addition, 1 deletion)

@@ -71,7 +71,7 @@ $ make show
To run the Coordinator on your local machine, use the command:

```shell
-$ python xain_fl/cli.py -f test_array.npy
+$ python xain_fl/cli.py --storage-endpoint <url_for_storage> --storage-key-id <id_for_storage> --storage-secret-access-key <password_for_storage> --storage-bucket <name_of_storage_bucket>
```

For more information about the CLI and its arguments, run:
docs/network_architecture.rst: 10 changes (5 additions, 5 deletions)

@@ -127,8 +127,8 @@ the server but the opposite is not true.
gRPC does use mechanisms from the underlying HTTP and TCP transport layers but
these are internal details that aren't really exposed in the API. A developer
can override the default timeouts but it's not clear from the available
-documentation the effect they have. For more information check [using gRPC in
-production](https://cs.mcgill.ca/~mxia3/2019/02/23/Using-gRPC-in-Production/).
+documentation the effect they have. For more information check `using gRPC in
+production <https://cs.mcgill.ca/~mxia3/2019/02/23/Using-gRPC-in-Production/>`_.

*Server-side timeouts configuration:*

@@ -377,9 +377,9 @@ The communication is summarised in the following sequence diagram.
In a training round, :math:`C` is in the state :code:`ROUND`. The selected
participant :math:`P` is in the :code:`TRAINING` state. The first message, sent by
:math:`P`, essentially kicks off the exchange. :math:`C` responds with the global
-model :math:`\weights` (and other data as specified in
+model :math:`weights` (and other data as specified in
:code:`StartTrainingRoundResponse`). Then :math:`P` carries out the training locally.
-When complete, it sends the updated model :math:`\weights'` (and other metadata)
+When complete, it sends the updated model :math:`weights'` (and other metadata)
back. :math:`C` responds with an acknowledgement.
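As an illustrative sketch of the exchange described above (the class and method names here are hypothetical stand-ins, not the actual xain-fl gRPC stubs):

```python
# Hypothetical sketch of the training-round exchange: P kicks off,
# C replies with the global weights, P trains locally, P sends the
# updated weights back, C acknowledges. Names are illustrative only.

class Coordinator:
    def __init__(self, global_weights):
        self.state = "ROUND"
        self.global_weights = global_weights
        self.updates = []

    def start_training_round(self):
        # C answers P's kickoff message with the global model weights.
        return self.global_weights

    def end_training_round(self, updated_weights):
        # C stores P's updated weights and acknowledges.
        self.updates.append(updated_weights)
        return "ACK"


class Participant:
    def __init__(self):
        self.state = "TRAINING"

    def run_round(self, coordinator, train_fn):
        weights = coordinator.start_training_round()    # receive global model
        updated = train_fn(weights)                     # train locally
        return coordinator.end_training_round(updated)  # send weights', get ack
```

A round then reduces to `Participant().run_round(coordinator, train_fn)`, with the real protocol carrying these messages over gRPC.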

.. image:: _static/sequence2.png
@@ -438,7 +438,7 @@

After a successful rendezvous, :math:`P` is in **Wait for Selection**. :math:`P` stays in
this state as long as it keeps receiving :code:`STANDBY` heartbeats. At some round
:math:`i`, :math:`C` may select :math:`P` for the round by responding with a :code:`ROUND` :math:`i`
heartbeat. At this point, :math:`P` moves to **Training** where the above sequence of
-training messages (:code:`StartTrainingRound` :math:`\rightarrow \weights \rightarrow \weights'
+training messages (:code:`StartTrainingRound` :math:`\rightarrow weights \rightarrow weights'
\rightarrow` :code:`EndTrainingRound`) occurs. Having received the :code:`EndTrainingRound` response from
:math:`C`, :math:`P` makes an "internal" transition to **Post-training** where it waits
until the start of the next round. If it has been selected again, it will
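The heartbeat-driven participant transitions described above can be sketched as a small state machine (illustrative only; state and heartbeat names mirror the prose, not xain-fl's actual implementation):

```python
# Hypothetical sketch of the participant-side selection states:
# P waits on STANDBY heartbeats, moves to TRAINING on a ROUND i
# heartbeat, and to POST_TRAINING after the EndTrainingRound response.

class ParticipantStateMachine:
    def __init__(self):
        self.state = "WAIT_FOR_SELECTION"

    def on_heartbeat(self, kind, round_index=None):
        if self.state == "WAIT_FOR_SELECTION":
            if kind == "STANDBY":
                return self.state        # keep waiting for selection
            if kind == "ROUND":
                self.state = "TRAINING"  # selected for round `round_index`
        return self.state

    def on_end_training_round(self):
        # Internal transition after C's EndTrainingRound response:
        # wait for the next round's heartbeat.
        self.state = "POST_TRAINING"
        return self.state
```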