Educational implementation of neural networks and backprop from scratch using numpy. Why? To learn how backprop works and how/why the PyTorch API is designed the way it is.
- Linear/Dense Layers
- ReLU, Sigmoid, TanH activation functions
- MSELoss
- Backprop/autograd for arrays of up to two dimensions (a minimal sketch of the core idea follows this list)
- Uniform weight initializer
- Plotting of computational graphs including partial derivatives
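To make the feature list concrete: the heart of such a package is reverse-mode autograd. Below is a minimal, self-contained sketch of that idea in numpy. It is illustrative only, not this package's actual code; the names `Tensor`, `parents`, `backward_fn` and the restriction to 2D matmul operands are assumptions made for the sketch.

```python
import numpy as np

class Tensor:
    def __init__(self, data, parents=(), backward_fn=None):
        self.data = np.asarray(data, dtype=float)
        self.grad = np.zeros_like(self.data)
        self.parents = parents          # tensors this one was computed from
        self.backward_fn = backward_fn  # propagates this tensor's grad to its parents

    def __matmul__(self, other):
        # for simplicity this sketch assumes both operands are 2D
        out = Tensor(self.data @ other.data, parents=(self, other))
        def backward_fn():
            # chain rule for y = x @ w: dL/dx = dL/dy @ w.T, dL/dw = x.T @ dL/dy
            self.grad += out.grad @ other.data.T
            other.grad += self.data.T @ out.grad
        out.backward_fn = backward_fn
        return out

    def relu(self):
        out = Tensor(np.maximum(self.data, 0.0), parents=(self,))
        def backward_fn():
            self.grad += (self.data > 0) * out.grad  # gradient passes only where x > 0
        out.backward_fn = backward_fn
        return out

    def backward(self):
        # topologically sort the graph, then apply the chain rule in reverse
        order, seen = [], set()
        def visit(t):
            if id(t) not in seen:
                seen.add(id(t))
                for p in t.parents:
                    visit(p)
                order.append(t)
        visit(self)
        self.grad = np.ones_like(self.data)  # seed; equivalent to differentiating out.sum()
        for t in reversed(order):
            if t.backward_fn is not None:
                t.backward_fn()

x = Tensor(np.random.randn(4, 3))
w = Tensor(np.random.randn(3, 2))
out = (x @ w).relu()
out.backward()  # x.grad and w.grad now hold the partial derivatives
```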
Visualization of the computational graph
```python
import numpy as np
# Sequential, Linear, ReLu and Tensor come from this package

model = Sequential([
    Linear(in_features=10, out_features=30),
    ReLu(),
    Linear(in_features=30, out_features=30),
    ReLu(),
    Linear(in_features=30, out_features=1),
])

x_input = Tensor(np.arange(10))
y_output = model.forward(x_input)
y_output.backward()
```
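The graph plotting traverses exactly this kind of structure. As a generic illustration of the technique, not this package's actual API (the `parents` attribute is the assumption carried over from the autograd sketch above), a graph can be serialized to Graphviz DOT by following each tensor's parent links:

```python
def to_dot(root):
    # walk the graph from the output tensor back to the inputs,
    # emitting one DOT node per tensor and one edge per parent link
    lines, seen = ["digraph G {"], set()
    def visit(t):
        if id(t) in seen:
            return
        seen.add(id(t))
        lines.append(f'  n{id(t)} [label="shape={t.data.shape}"];')
        for p in t.parents:
            lines.append(f"  n{id(p)} -> n{id(t)};")
            visit(p)
    visit(root)
    lines.append("}")
    return "\n".join(lines)

print(to_dot(y_output))  # save as graph.dot, render with: dot -Tpng graph.dot -o graph.png
```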
Install this package including its dependencies with:

```
pip install -e .
```
Installing Graphviz on Windows: as there is no proper installer for Graphviz on Windows, download the zip archive, extract it anywhere, and add its `bin` subfolder to your `PATH` so that graph drawing works.
Run the bundled function approximation example with:

```
cd examples
python function_approximation.py
```
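For orientation, a function-approximation script along these lines might look roughly as follows. This is a hedged sketch, not the contents of `function_approximation.py`: the `MSELoss` call signature, the `model.parameters()` accessor, the `.data`/`.grad` attributes, and the manual SGD update are all assumptions modeled on a PyTorch-style API.

```python
import numpy as np

# small model: 1 input feature -> 1 output, for a 1D curve fit
model = Sequential([
    Linear(in_features=1, out_features=30),
    ReLu(),
    Linear(in_features=30, out_features=1),
])

x = Tensor(np.linspace(-1.0, 1.0, 100).reshape(100, 1))
y_true = Tensor(np.sin(3.0 * x.data))         # target function to approximate

criterion = MSELoss()
learning_rate = 0.01
for step in range(2000):
    y_pred = model.forward(x)                 # forward pass
    loss = criterion.forward(y_pred, y_true)  # assumed signature
    loss.backward()                           # fills .grad on every parameter
    for p in model.parameters():              # assumed accessor
        p.data -= learning_rate * p.grad      # plain gradient-descent step
        p.grad[:] = 0.0                       # reset gradients for the next step
```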
To run the tests, PyTorch is required as a dependency, since it serves as the reference implementation:

```
python -m unittest
```
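To illustrate what "as a reference" means here: a typical test computes the same gradient with PyTorch and compares the two results numerically. A generic sketch of that technique (not one of this package's actual test cases):

```python
import numpy as np
import torch

x_np = np.random.randn(4, 10)
w_np = np.random.randn(10, 3)

# reference gradients from PyTorch's autograd
x = torch.tensor(x_np, requires_grad=True)
w = torch.tensor(w_np, requires_grad=True)
torch.relu(x @ w).sum().backward()

ref_dx = x.grad.numpy()  # compare the from-scratch gradients against these,
ref_dw = w.grad.numpy()  # e.g. assert np.allclose(my_x_grad, ref_dx)
```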
This software is licensed under the MIT License.