
Mastering AI with Transformers and LLMs For NLP Applications Course

Mastering AI With Transformers and LLMs for NLP Applications isn't just a course; it's a transformative experience that arms learners with the expertise, practical skills, and innovation-driven mindset needed to navigate and lead in the ever-evolving landscape of Artificial Intelligence.

Course Overview

Embark on a journey into the heart of Large Language Models (LLMs) with this comprehensive course. Dive deep into the intricacies of Transformers, layer by layer, from the basics to building your own models and deploying them in production.

Course Content

Chapter 1: Introduction (Understanding Transformers)

  1. Explore the Transformers Pipeline Module - Notebook (a minimal pipeline example follows this list)
  2. High-Level Understanding of Transformers Architecture
  3. What are Language Models
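
A minimal sketch of the `pipeline` API from the Hugging Face `transformers` library, which Chapter 1 explores. The default checkpoint and the example sentence are illustrative, not the notebook's exact contents.

```python
# Quick taste of the pipeline API: tokenizer, model, and post-processing in one call.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a default sentiment checkpoint

print(classifier("This course made Transformers finally click for me!"))
# e.g. [{'label': 'POSITIVE', 'score': 0.999...}]
```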

Chapter 2: Transformers Architecture

  1. Input Embedding - Notebook
  2. Positional Encoding
  3. The Encoder
  4. The Decoder
  5. Autoencoding LM - BERT
  6. Autoregressive LM - GPT
  7. Sequence2Sequence LM - T5
  8. Tokenization - Notebook (a short tokenizer example follows this list)
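
A short tokenizer example in the spirit of the Chapter 2 tokenization notebook; the `bert-base-uncased` checkpoint and sample sentence are assumptions for illustration.

```python
# Subword tokenization with a pretrained WordPiece tokenizer.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # illustrative checkpoint

text = "Transformers rely on subword tokenization."
encoded = tokenizer(text)

print(tokenizer.tokenize(text))                 # subword pieces
print(encoded["input_ids"])                     # vocabulary ids, with [CLS] and [SEP] added
print(tokenizer.decode(encoded["input_ids"]))   # round-trip back to text
```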

Chapter 3: Text Classification

  1. Fine-tuning BERT for Sentiment Analysis - Notebook (a fine-tuning outline follows this list)
  2. Fine-tuning BERT for Multi-Class Classification - Notebook
  3. Fine-tuning BERT for Sentence-Pairs - Notebook
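
A hedged outline of BERT fine-tuning with the `Trainer` API. The dataset (IMDB), checkpoint, and hyperparameters are placeholder choices, not the notebooks' actual settings.

```python
# Fine-tune BERT for binary sentiment classification (sketch).
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

dataset = load_dataset("imdb")                                   # illustrative dataset
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

args = TrainingArguments(output_dir="bert-sentiment",
                         num_train_epochs=1,
                         per_device_train_batch_size=16)

trainer = Trainer(model=model,
                  args=args,
                  train_dataset=dataset["train"].shuffle(seed=42).select(range(2000)),
                  eval_dataset=dataset["test"].select(range(500)))
trainer.train()
```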

Chapter 4: Question Answering

  1. QA Intuition
  2. Build a QA System based on Amazon Reviews - Notebook (a QA pipeline sketch follows this list)
  3. Implement Retriever Reader Approach - Notebook
  4. Fine-Tuning Transformers for Question Answering Systems - Notebook
  5. Table QA - Notebook
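
A minimal extractive QA sketch with the question-answering pipeline; the review text is an invented example standing in for the Amazon reviews used in the chapter.

```python
# Extractive QA: the model selects an answer span from the given context.
from transformers import pipeline

qa = pipeline("question-answering")  # default extractive QA checkpoint

context = ("The battery easily lasts two full days, but the charger that ships "
           "with the phone feels flimsy and stopped working after a week.")
result = qa(question="How long does the battery last?", context=context)

print(result["answer"], result["score"])  # answer span plus the model's confidence
```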

Chapter 5: Text Generation

  1. Introduction to Text Generation (a comparison of the decoding strategies follows this list)
  2. Greedy Search Decoding - Notebook
  3. Beam Search Decoding - Notebook
  4. Sampling Methods - Notebook
  5. Train Your Own GPT - Notebook
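
A compact comparison of the decoding strategies covered in this chapter, using GPT-2; the prompt and generation settings are illustrative defaults rather than the notebooks' exact values.

```python
# Greedy search, beam search, and sampling with model.generate().
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

inputs = tokenizer("Transformers are", return_tensors="pt")

greedy = model.generate(**inputs, max_new_tokens=30, do_sample=False)   # greedy search
beam   = model.generate(**inputs, max_new_tokens=30, num_beams=5)       # beam search
sample = model.generate(**inputs, max_new_tokens=30, do_sample=True,
                        top_k=50, top_p=0.95, temperature=0.8)          # sampling

for name, output in [("greedy", greedy), ("beam", beam), ("sample", sample)]:
    print(name, "->", tokenizer.decode(output[0], skip_special_tokens=True))
```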

Chapter 6: Text Summarization

  1. Introduction to GPT-2, T5, BART, PEGASUS
  2. Evaluation Metrics - BLEU, ROUGE (a ROUGE scoring example follows this list)
  3. Fine-Tuning PEGASUS for Dialogue Summarization - Notebook
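
A small ROUGE scoring example using the `evaluate` library; the candidate and reference summaries are made up for illustration.

```python
# ROUGE measures n-gram overlap between a generated summary and a reference.
import evaluate

rouge = evaluate.load("rouge")

predictions = ["the meeting was moved to friday afternoon"]
references  = ["they agreed to move the meeting to friday afternoon"]

scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # rouge1 / rouge2 / rougeL scores between 0 and 1
```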

Chapter 7: Build Your Own Transformer From Scratch

  1. Build Custom Tokenizer - Notebook
  2. Getting Your Data Ready - Notebook
  3. Implement Positional Embedding - Notebook (a sinusoidal encoding sketch follows this list)
  4. Implement Transformer Architecture - Notebook
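
A sinusoidal positional-encoding sketch in PyTorch, in the spirit of the original Transformer formulation; the sequence length and model dimension are arbitrary examples.

```python
import math
import torch

def sinusoidal_positional_encoding(seq_len: int, d_model: int) -> torch.Tensor:
    """Return a (seq_len, d_model) matrix of fixed sinusoidal position encodings."""
    position = torch.arange(seq_len).unsqueeze(1)                                   # (seq_len, 1)
    div_term = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(position * div_term)   # even dimensions use sine
    pe[:, 1::2] = torch.cos(position * div_term)   # odd dimensions use cosine
    return pe

print(sinusoidal_positional_encoding(seq_len=50, d_model=512).shape)  # torch.Size([50, 512])
```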

Chapter 8: Deploy Transformers Model in Production Environment

  1. Model Optimization with Knowledge Distillation - Notebook
  2. Model Optimization with Quantization - Notebook
  3. Model Optimization with ONNX and the ONNX Runtime - Notebook
  4. Serving Transformers with FastAPI, Dockerizing Your Transformers APIs - Project (a minimal serving sketch follows this list)
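
A minimal FastAPI serving sketch; the endpoint name, request schema, and model choice are illustrative, not the course project's exact layout.

```python
# app.py: expose a Transformers pipeline behind an HTTP endpoint.
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

app = FastAPI()
classifier = pipeline("sentiment-analysis")  # loaded once at startup

class PredictRequest(BaseModel):
    text: str

@app.post("/predict")
def predict(request: PredictRequest):
    return classifier(request.text)[0]  # {"label": ..., "score": ...}

# Run locally with: uvicorn app:app --host 0.0.0.0 --port 8000
```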

Requirements

  • Basic Python Programming
  • Fundamental Machine Learning Knowledge
  • NLP Basics (Optional)
  • A working Python and Jupyter Notebook environment

Why Enroll?

  • Comprehensive Learning: From theory to practical application and deployment.
  • Practical Knowledge: Transform theoretical concepts into practical skills.
  • Versatility: Apply Transformers to a wide range of NLP tasks.
  • Innovation: Become the architect of AI innovation.
  • Deployment Mastery: Seamlessly deploy models in production environments.
  • Real-World Relevance: Skills directly applicable to real-world scenarios.

Target Audience

  • AI Enthusiasts and Beginners
  • Data Scientists and Machine Learning Engineers
  • Developers and Programmers
  • NLP Enthusiasts
  • Researchers and Academics
  • AI Innovators and Entrepreneurs

Coming Soon!

  • AI for Audio Course
  • Transformers for Vision Course
