Neural networks with backprop from scratch

Educational implementation of neural networks and backpropagation from scratch using NumPy. Why? To learn how backprop works and how/why the PyTorch API is designed the way it is.

Features

  • Linear/dense layers
  • ReLU, Sigmoid, and Tanh activation functions
  • MSELoss
  • Backprop/autograd for arrays of up to two dimensions
  • Uniform weight initialization
  • Plotting of the computational graph, including partial derivatives
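
To illustrate the mechanism behind the backprop/autograd feature, here is a minimal scalar sketch (hypothetical names, not this repo's API; the library itself operates on NumPy arrays): each operation records its parents and a local backward rule, and `backward()` walks the graph in reverse topological order, accumulating gradients via the chain rule.

```python
import math

class Value:
    """Minimal scalar autograd node (illustrative sketch, not the repo's Tensor)."""
    def __init__(self, data, parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = parents
        self._backward = lambda: None  # local chain-rule step, set by each op

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad   # d(a+b)/da = 1
            other.grad += out.grad  # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1.0 - t * t) * out.grad  # d tanh(x)/dx = 1 - tanh(x)^2
        out._backward = _backward
        return out

    def backward(self):
        # Topological sort so each node is processed after everything that uses it.
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0  # d(output)/d(output) = 1
        for v in reversed(order):
            v._backward()

# y = tanh(w * x + b): one "neuron" with a tanh activation
x, w, b = Value(0.5), Value(2.0), Value(-0.5)
y = (w * x).__add__(b).tanh()
y.backward()
# w.grad now holds dy/dw = (1 - tanh(w*x+b)^2) * x
```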

Examples

function_approximation.py

Visualization of the computational graph

Usage

```python
model = Sequential([
    Linear(in_features=10, out_features=30),
    ReLu(),
    Linear(in_features=30, out_features=30),
    ReLu(),
    Linear(in_features=30, out_features=1),
])
x_input = Tensor(np.arange(10))
y_output = model.forward(x_input)
y_output.backward()
```
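
For intuition, the forward pass of the model above amounts to the following plain-NumPy computation (a sketch only: the weight shapes follow the `in_features`/`out_features` arguments, while the uniform initialization range and zero biases are assumptions, not the library's defaults):

```python
import numpy as np

rng = np.random.default_rng(0)

def linear(x, W, b):
    # A Linear layer computes the affine map y = x @ W + b
    return x @ W + b

def relu(x):
    return np.maximum(x, 0.0)

# Same architecture as the usage example: 10 -> 30 -> 30 -> 1
W1, b1 = rng.uniform(-0.1, 0.1, (10, 30)), np.zeros(30)
W2, b2 = rng.uniform(-0.1, 0.1, (30, 30)), np.zeros(30)
W3, b3 = rng.uniform(-0.1, 0.1, (30, 1)), np.zeros(1)

x = np.arange(10, dtype=np.float64)
y = linear(relu(linear(relu(linear(x, W1, b1)), W2, b2)), W3, b3)
print(y.shape)  # (1,)
```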

Install

Install this package, including its dependencies, with:

pip install -e .

Installing Graphviz on Windows: As there is no proper installer for Graphviz on Windows, download it, unzip it anywhere, and add the bin subfolder to your PATH to get the graph drawing to work.

Run example

cd examples
python function_approximation.py

Run tests

To run the tests, PyTorch is required as a dependency, since it serves as the reference implementation.

python -m unittest
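
When comparing against a reference implementation, a finite-difference check is a generic way to sanity-check analytic gradients (a standalone sketch, not part of this repo's test suite):

```python
import numpy as np

def numerical_grad(f, x, eps=1e-6):
    """Central-difference gradient of a scalar-valued f at x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        xp, xm = x.copy(), x.copy()
        xp.flat[i] += eps
        xm.flat[i] -= eps
        g.flat[i] = (f(xp) - f(xm)) / (2 * eps)
    return g

# Example: f(x) = sum(tanh(x)); the analytic gradient is 1 - tanh(x)^2
x = np.linspace(-1.0, 1.0, 5)
num = numerical_grad(lambda v: np.tanh(v).sum(), x)
ana = 1.0 - np.tanh(x) ** 2
print(np.max(np.abs(num - ana)))  # should be tiny
```

The same pattern applies to any layer: perturb one weight at a time, re-run the forward pass to a scalar loss, and compare the difference quotient against the gradient that backprop produced.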

License

This software is licensed under MIT.

References

Similar projects
