
"What is RaptorQ?"


The following is a sequence of definitions, progressing from the most general concept to the most specific, and ending with the definition of RaptorQ: the code that our project implements (and the source of the last two letters of our project's name).


Forward Error Correction

Commonly known as FEC. A technique for recovering from errors in data transmitted over unreliable or noisy communication channels. The central idea is that the sender encodes the message in a redundant way by applying an error-correcting code, which allows the receiver to repair the errors.

For more information visit the Wikipedia article on FEC.

FEC code

An algorithm used in Forward Error Correction that specifies how the encoding and decoding processes work. FEC codes do not dictate how data is transmitted from one place to another; rather, they may supply a transmission protocol with encoded data in a specific format, as defined by a FEC Scheme.

Erasure code

A FEC code with the capability to recover from data losses during communication. The data is divided into K source symbols (small blocks), which are transformed into a larger number of N encoding symbols such that the original data can be retrieved from a subset of the N encoding symbols. We call the extra N - K symbols the symbol overhead.

For more information visit the Wikipedia article on Erasure code.
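As a minimal illustration of the idea (a toy code, not RaptorQ), the sketch below encodes K source symbols into N = K + 1 encoding symbols by appending a single XOR parity symbol; any one lost symbol, source or parity, can then be rebuilt from the K symbols that remain, so the symbol overhead here is exactly 1.

```java
import java.util.Arrays;

/** Toy erasure code: K source symbols plus one XOR parity symbol (N = K + 1). */
public class XorParity {

    // Encode: keep the K source symbols and append their XOR as a repair symbol.
    static byte[][] encode(byte[][] source) {
        int k = source.length, len = source[0].length;
        byte[][] encoded = Arrays.copyOf(source, k + 1);
        encoded[k] = new byte[len];
        for (byte[] s : source)
            for (int b = 0; b < len; b++) encoded[k][b] ^= s[b];
        return encoded;
    }

    // Recover a single lost symbol: XOR every symbol that survived,
    // because the XOR of all N symbols is zero by construction.
    static byte[] recover(byte[][] encoded, int lostIndex) {
        int len = encoded[(lostIndex + 1) % encoded.length].length;
        byte[] out = new byte[len];
        for (int i = 0; i < encoded.length; i++) {
            if (i == lostIndex) continue;
            for (int b = 0; b < len; b++) out[b] ^= encoded[i][b];
        }
        return out;
    }
}
```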

Fountain code

A class of erasure codes with two important properties:

  • an arbitrary number of encoding symbols can be produced on the fly, simplifying the adaptation to varying loss rates;
  • the data can be reconstructed with high probability from any subset of the encoding symbols (of size equal to K symbols plus a small symbol overhead).

For more information visit the Wikipedia article on Fountain code.
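As a sketch of these two properties, consider a toy "random linear fountain" over GF(2) (again, a teaching example, not RaptorQ or any standardized code): each encoding symbol is the XOR of a pseudo-random subset of the K source symbols, identified only by the seed that generated it, so the sender can keep producing fresh symbols for as long as needed; the receiver decodes by Gaussian elimination once the subsets of the received symbols span all K source symbols.

```java
import java.util.Random;

/** Toy random linear fountain over GF(2); symbols are identified by their seeds. */
public class ToyFountain {

    static final int K = 4;            // number of source symbols
    static final int SYMBOL_SIZE = 8;  // bytes per symbol

    // Produce one encoding symbol: the XOR of the source symbols selected by 'seed'.
    // Any number of symbols can be generated on the fly by picking new seeds.
    static byte[] encode(byte[][] source, long seed) {
        byte[] out = new byte[SYMBOL_SIZE];
        boolean[] mask = subset(seed);
        for (int i = 0; i < K; i++)
            if (mask[i])
                for (int b = 0; b < SYMBOL_SIZE; b++) out[b] ^= source[i][b];
        return out;
    }

    // The (non-empty) pseudo-random subset of source symbols associated with a seed.
    static boolean[] subset(long seed) {
        Random rng = new Random(seed);
        boolean[] mask = new boolean[K];
        boolean any = false;
        for (int i = 0; i < K; i++) { mask[i] = rng.nextBoolean(); any |= mask[i]; }
        if (!any) mask[0] = true;  // avoid the useless all-zero combination
        return mask;
    }

    // Decode from any set of received (seed, payload) pairs via Gaussian
    // elimination over GF(2). Returns null if they are not yet enough.
    static byte[][] decode(long[] seeds, byte[][] payloads) {
        int n = seeds.length;
        boolean[][] m = new boolean[n][];
        byte[][] rhs = new byte[n][];
        for (int r = 0; r < n; r++) { m[r] = subset(seeds[r]); rhs[r] = payloads[r].clone(); }

        for (int col = 0; col < K; col++) {
            int pivot = -1;
            for (int r = col; r < n; r++) if (m[r][col]) { pivot = r; break; }
            if (pivot < 0) return null;  // need more encoding symbols
            boolean[] tm = m[col]; m[col] = m[pivot]; m[pivot] = tm;
            byte[] tr = rhs[col]; rhs[col] = rhs[pivot]; rhs[pivot] = tr;
            for (int r = 0; r < n; r++) {
                if (r != col && m[r][col]) {
                    for (int c = 0; c < K; c++) m[r][c] ^= m[col][c];
                    for (int b = 0; b < SYMBOL_SIZE; b++) rhs[r][b] ^= rhs[col][b];
                }
            }
        }
        byte[][] recovered = new byte[K][];
        for (int i = 0; i < K; i++) recovered[i] = rhs[i];
        return recovered;
    }
}
```

If a received set of K symbols happens not to be linearly independent, decode returns null and the receiver simply waits for a few more symbols; that extra handful is the small symbol overhead mentioned above.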

RaptorQ code

Currently the closest practical solution to an ideal digital fountain code, and it is specified by the FEC Scheme described in RFC 6330. It achieves constant per-symbol encoding/decoding cost with a symbol overhead near zero.

For example, from data partitioned into K source symbols, the RaptorQ decoder can:

  • with any set of K encoding symbols, successfully decode the original data with 99% probability;
  • with any set of K + 1 encoding symbols, successfully decode the original data with 99.99% probability;
  • with any set of K + 2 encoding symbols, successfully decode the original data with 99.9999% probability (a one-in-a-million chance of failure).
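Those figures follow a simple pattern: each additional encoding symbol beyond K reduces the decoding failure probability by roughly a factor of 100, i.e. failure ≈ 0.01^(overhead + 1). The snippet below merely restates the list above in that form (an illustration of the stated numbers, not a derivation from the RaptorQ specification):

```java
/** Decoding failure probability implied by the list above: 0.01^(overhead + 1). */
public class RaptorQOverhead {

    static double failureProbability(int symbolOverhead) {
        return Math.pow(0.01, symbolOverhead + 1);
    }

    public static void main(String[] args) {
        for (int overhead = 0; overhead <= 2; overhead++) {
            System.out.printf("K + %d symbols -> success probability %.6f%%%n",
                    overhead, 100 * (1 - failureProbability(overhead)));
        }
    }
}
```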