Fixed typos in tutorial #11

Open · wants to merge 1 commit into master
6 changes: 3 additions & 3 deletions docs/getting_started.rst
@@ -29,11 +29,11 @@ In this example, the file :samp:`main.py` contains a function :samp:`my_task` th
     def my_task(ctx: mlxp.Context)->None:

         # Displaying user-defined options from './configs/config.yaml'
-        print("ctx.config")
+        print(ctx.config)

         # Logging information in log directory created by MLXP: (here "./logs/1" )
-        for i in range(ctx.config.num_epoch)
-            ctx.logger.log_metrics({"epoch":i})
+        for i in range(ctx.config.num_epoch):
+            ctx.logger.log_metrics({"epoch":i}, log_name="Quickstart")

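For context, here is why the corrected lines matter: `print(ctx.config)` prints the actual configuration object rather than the literal string `"ctx.config"`, and the `for` line needs its trailing colon. The sketch below uses a hypothetical stand-in for MLXP's `mlxp.Context` (a `SimpleNamespace` with a dummy logger) purely for illustration; the real `Context` and `logger.log_metrics` come from MLXP itself.

```python
# Hypothetical stand-in for mlxp.Context, for illustration only.
from types import SimpleNamespace

class DummyLogger:
    """Collects metrics in memory; the real MLXP logger writes to the run's log directory."""
    def __init__(self):
        self.records = []

    def log_metrics(self, metrics, log_name="metrics"):
        self.records.append((log_name, metrics))

ctx = SimpleNamespace(
    config=SimpleNamespace(num_epoch=3),
    logger=DummyLogger(),
)

def my_task(ctx):
    # Prints the config object itself, unlike the buggy print("ctx.config")
    print(ctx.config)
    for i in range(ctx.config.num_epoch):  # note the colon fixed by this PR
        ctx.logger.log_metrics({"epoch": i}, log_name="Quickstart")

my_task(ctx)
```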
12 changes: 6 additions & 6 deletions docs/tutorial_introduction.rst
@@ -19,22 +19,22 @@ The first step is to create a directory 'tutorial' containing the code needed fo
     tutorial/
     ├── configs/
     │   └── config.yaml
-    ├── core.py
+    ├── core_app.py
     ├── main.py
     └── read.py

-The directory contains three files: :samp:`core.py`, :samp:`main.py` :samp:`results.py`. It also contains a directory :samp:`configs` that will be used later by MLXP. For now, we will only have a look at the :samp:`core.py` and :samp:`main.py` files.
+The directory contains three files: :samp:`core_app.py`, :samp:`main.py`, and :samp:`read.py`. It also contains a directory :samp:`configs` that will be used later by MLXP. For now, we will only have a look at the :samp:`core_app.py` and :samp:`main.py` files.


-The :samp:`core.py` file
+The :samp:`core_app.py` file
 """"""""""""""""""""""""

-The file :samp:`core.py` contains a PyTorch implementation of a one hidden layer network :samp:`OneHiddenLayer` as well as a simple data loader :samp:`DataLoader` that we will use during training.
-In the rest of the tutorial, we will not need to worry about the content of :samp:`core.py`, but let's just have a quick look at this file:
+The file :samp:`core_app.py` contains a PyTorch implementation of a one-hidden-layer network :samp:`OneHiddenLayer` as well as a simple data loader :samp:`DataLoader` that we will use during training.
+In the rest of the tutorial, we will not need to worry about the content of :samp:`core_app.py`, but let's just have a quick look at this file:


 .. code-block:: python
-   :caption: main.py
+   :caption: core_app.py

    import torch
    import torch.nn as nn
2 changes: 1 addition & 1 deletion tutorial/README.rst
@@ -4,7 +4,7 @@ MLXP Tutorial
 This page presents the main functionalities of MLXP through a tutorial.
 See the following page for more detailed information:

-- `Tutorial <https://inria-thoth.github.io/mlxp/tutorial.html>`__
+- `Tutorial <https://inria-thoth.github.io/mlxp/pages/master/tutorial.html>`__

 Requirements
 ------------
4 changes: 2 additions & 2 deletions tutorial/core_app.py
@@ -27,8 +27,8 @@ def __init__(self, d_int, device, normalize=False):
         self.X = torch.normal(mean= torch.zeros(N_samples,d_int,dtype=dtype,device=device),std=1.)

         if normalize:
-            inv_norm = 1./tr.norm(self.X,dim=1)
-            self.X = tr.einsum('nd,n->nd',self.X,inv_norm)
+            inv_norm = 1./torch.norm(self.X,dim=1)
+            self.X = torch.einsum('nd,n->nd',self.X,inv_norm)

         self.total_size = N_samples

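For reference, `torch.einsum('nd,n->nd', X, inv_norm)` scales each row `n` of `X` by the scalar `inv_norm[n]`, so after this fix every row of `self.X` has unit Euclidean norm. The following is a minimal plain-Python sketch of that computation (lists of floats instead of tensors, so it carries no PyTorch dependency); `normalize_rows` is a hypothetical helper name, not part of the tutorial's code.

```python
import math

def normalize_rows(X):
    # Equivalent of: inv_norm = 1./torch.norm(X, dim=1)
    #                X = torch.einsum('nd,n->nd', X, inv_norm)
    inv_norm = [1.0 / math.sqrt(sum(v * v for v in row)) for row in X]
    # Multiply each row by its own scalar, as einsum('nd,n->nd', ...) does.
    return [[v * s for v in row] for row, s in zip(X, inv_norm)]

X = [[3.0, 4.0], [0.0, 2.0]]
Xn = normalize_rows(X)  # rows of Xn now have unit Euclidean norm
```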