
Classification-DogvsCat-using-PyTorch-from-Scratch

🔰 This project uses PyTorch to classify images of dogs and cats. I built the models from scratch and compared several of them to see which performed best.

📎 I built this project to practice what I learned from the 30-hour course PyTorch for Deep Learning & Machine Learning – Full Course by freecodecamp.org.

Notebook - Train models

Open in Colab

🖱️ If you don't want to train on Colab, just run the commands below:

User> pip install -r requirements.txt
User> python3 train_resnet_model.py
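
For orientation, here is a minimal sketch of how a ResNet50 transfer-learning setup in PyTorch typically looks. The data path, hyperparameters, and training loop below are illustrative assumptions only; the actual train_resnet_model.py in this repository may be organized differently.

# Minimal sketch of a ResNet50 transfer-learning setup (illustrative only;
# the data folder "data/train/{cat,dog}" and hyperparameters are assumptions).
import torch
from torch import nn
from torch.utils.data import DataLoader
from torchvision import datasets, models

device = "cuda" if torch.cuda.is_available() else "cpu"

# Pretrained weights ship with their own preprocessing transforms.
weights = models.ResNet50_Weights.DEFAULT
train_data = datasets.ImageFolder("data/train", transform=weights.transforms())
train_loader = DataLoader(train_data, batch_size=32, shuffle=True)

# Freeze the backbone and replace the classifier head with a 2-class layer.
model = models.resnet50(weights=weights)
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)
model = model.to(device)

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

for epoch in range(5):
    model.train()
    for X, y in train_loader:
        X, y = X.to(device), y.to(device)
        loss = loss_fn(model(X), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()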

📉 Evaluating the Training and Testing Process

  • The ResNet50 model gives much better results than the TinyVGG model; however, the ResNet50 model overfits. To address this, we could reduce the number of layers in the ResNet50 model, or experiment with other optimizers such as SGD or Adam and compare their effectiveness (see the optimizer sketch below).

    [Training and testing curves: TinyVGG model | ResNet50 model]
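
As one concrete way to run the optimizer comparison suggested above, a small helper like the following makes it easy to switch between Adam and SGD. The helper and its parameters are illustrative, not part of this repository.

# Illustrative helper for comparing optimizers; not part of this repository.
import torch

def make_optimizer(model, name="adam", lr=1e-3):
    if name == "sgd":
        # Momentum plus weight decay is a common recipe for reducing overfitting.
        return torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9, weight_decay=1e-4)
    return torch.optim.Adam(model.parameters(), lr=lr, weight_decay=1e-4)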

😾 Predict with TinyVGG model

[Example predictions: a dog image and a cat image classified with the TinyVGG model]

😾 Predict with ResNet50 model

[Example predictions: a cat image and a dog image classified with the ResNet50 model]
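
For reference, single-image prediction with the saved ResNet50 weights could look roughly like this. It assumes ResNet.pth stores a state_dict for a ResNet50 with a 2-class head and that the label order is ["cat", "dog"]; adjust to match the notebook.

# Sketch of single-image inference; the checkpoint format and label order are assumptions.
import torch
from torchvision import models
from PIL import Image

class_names = ["cat", "dog"]  # assumed label order

weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=None)
model.fc = torch.nn.Linear(model.fc.in_features, 2)
model.load_state_dict(torch.load("deploy/dogvscat_mini/ResNet.pth", map_location="cpu"))
model.eval()

image = Image.open("deploy/dogvscat_mini/examples/example_1.jpg")
batch = weights.transforms()(image).unsqueeze(0)

with torch.inference_mode():
    probs = torch.softmax(model(batch), dim=1)[0]
print({name: float(p) for name, p in zip(class_names, probs)})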

Evaluating and deploying the models

Notebook - Evaluating and deploying the models

Open in Colab

📁 File Structure

deploy/
└── dogvscat_mini/
    ├── ResNet.pth
    ├── app.py
    ├── examples/
    │   ├── example_1.jpg
    │   ├── example_2.jpg
    │   └── example_3.jpg
    ├── model.py
    └── requirements.txt
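
As a rough picture of how the deployment pieces fit together, app.py for a Hugging Face Space is often a small Gradio app along these lines. Gradio, the create_model helper, and the label order are assumptions here, not a description of the actual files.

# Hypothetical sketch of deploy/dogvscat_mini/app.py; Gradio and create_model() are assumptions.
import gradio as gr
import torch
from model import create_model  # assumed helper in model.py returning (model, transform)

model, transform = create_model()
model.load_state_dict(torch.load("ResNet.pth", map_location="cpu"))
model.eval()
class_names = ["cat", "dog"]  # assumed label order

def predict(image):
    batch = transform(image).unsqueeze(0)
    with torch.inference_mode():
        probs = torch.softmax(model(batch), dim=1)[0]
    return {name: float(p) for name, p in zip(class_names, probs)}

demo = gr.Interface(
    fn=predict,
    inputs=gr.Image(type="pil"),
    outputs=gr.Label(num_top_classes=2),
    examples=["examples/example_1.jpg", "examples/example_2.jpg", "examples/example_3.jpg"],
)
demo.launch()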

Use a Git access token to push to the Hugging Face Space.
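
One common way to do this over HTTPS is to add the Space as a remote and use a Hugging Face access token as the password; the placeholders below are not real values.

User> git remote add space https://huggingface.co/spaces/<username>/<space-name>
User> git push https://<username>:<token>@huggingface.co/spaces/<username>/<space-name> main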
