ELEC 4531/5531: Introduction to Deep Learning: Building with PyTorch

🎓 Welcome to ELEC 4531/5531, Introduction to Deep Learning: Building with PyTorch! This course introduces the fundamentals of deep learning through hands-on projects in PyTorch. Students will build and train neural networks, explore key architectures such as CNNs and Transformers, and apply them to real-world tasks. By the end of the course, students will be able to implement and experiment with advanced neural network architectures.

📚 Syllabus:

Weekly schedule (assignments are tentative):
Week 1: Introduction to Deep Learning & Course Setup
  • Overview of machine learning and deep learning
  • Course logistics and tools (Google Colab, AWS, MS Copilot, Hugging Face)
  • Setting up a Python environment and Jupyter notebooks
Week 2: Python for Machine Learning
  • NumPy, Pandas, Matplotlib, PIL
  • Data manipulation and visualization
  • Basics of image processing
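To preview the week-2 toolkit, a minimal sketch using made-up data (the array values and column names here are purely illustrative):

```python
import numpy as np
import pandas as pd

# NumPy: vectorized math on a small array of (hypothetical) pixel intensities
pixels = np.array([[0, 64], [128, 255]], dtype=np.float32)
normalized = pixels / 255.0          # scale to [0, 1], a common preprocessing step

# Pandas: tabular manipulation of (hypothetical) training results
df = pd.DataFrame({"epoch": [1, 2, 3], "loss": [0.9, 0.5, 0.3]})
best = df.loc[df["loss"].idxmin(), "epoch"]   # epoch with the lowest loss

print(normalized.max())   # 1.0
print(best)               # 3
```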
Week 3: Intro to PyTorch
  • PyTorch installation and setup
  • Tensors: creation, indexing, reshaping
  • Basic operations: matrix multiplication, broadcasting
  • Autograd and simple gradient tracking
  Assignment: HW1
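The week-3 topics can be sketched in a few lines (the tensor values are arbitrary examples):

```python
import torch

# Tensor creation and reshaping
x = torch.arange(6, dtype=torch.float32).reshape(2, 3)   # 2x3 tensor holding 0..5

# Broadcasting: a (2, 3) tensor plus a (3,) row vector
row = torch.tensor([10.0, 20.0, 30.0])
y = x + row                          # row is broadcast across both rows of x

# Matrix multiplication: (2, 3) @ (3, 2) -> (2, 2)
z = x @ x.T

# Autograd: track gradients through a scalar function
w = torch.tensor(2.0, requires_grad=True)
loss = w ** 2                        # loss = w^2
loss.backward()                      # d(loss)/dw = 2w = 4
print(w.grad)                        # tensor(4.)
```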
Week 4: Intro to TensorFlow
  • TensorFlow setup and workflow
  • Tensors and operations
  • Building simple models and training loops
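A minimal sketch of the TensorFlow 2.x workflow covered this week, assuming eager execution (the toy objective `w * 3 ≈ 6` is invented for illustration):

```python
import tensorflow as tf

# Tensors and basic operations
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.reduce_sum(a)                      # 10.0

# A minimal "training loop": fit one weight so that w * 3 ≈ 6
w = tf.Variable(1.0)
for _ in range(100):
    with tf.GradientTape() as tape:
        loss = (w * 3.0 - 6.0) ** 2       # squared error
    grad = tape.gradient(loss, w)
    w.assign_sub(0.01 * grad)             # plain gradient-descent update
print(float(w))                           # close to 2.0
```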
Week 5: Neural Network Fundamentals
  • Activation functions and loss functions
  • Feedforward neural networks (FNNs)
  • Backpropagation
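The pieces above — an activation, a loss function, and one backpropagation step — fit together roughly like this (the layer sizes and the random batch are illustrative):

```python
import torch
import torch.nn as nn

# A tiny feedforward network: 4 inputs -> 8 hidden units (ReLU) -> 2 outputs
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),                        # activation function
    nn.Linear(8, 2),
)
criterion = nn.CrossEntropyLoss()     # loss function for classification

# One backpropagation step on a random (hypothetical) batch
inputs = torch.randn(5, 4)            # batch of 5 samples
targets = torch.tensor([0, 1, 0, 1, 1])
loss = criterion(model(inputs), targets)
loss.backward()                       # gradients now populated in the parameters
print(model[0].weight.grad.shape)     # torch.Size([8, 4])
```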
Week 6: Convolutional Neural Networks (CNNs)
  • Convolution layers
  • Pooling, batch normalization, data augmentation
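A quick shape-check of these layers in PyTorch (the 3×32×32 input is just a CIFAR-sized placeholder):

```python
import torch
import torch.nn as nn

x = torch.randn(1, 3, 32, 32)        # (batch, channels, height, width)

conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)   # 3 -> 16 channels, same spatial size
bn   = nn.BatchNorm2d(16)                           # normalize each of the 16 channels
pool = nn.MaxPool2d(2)                              # halve height and width

out = pool(bn(conv(x)))
print(out.shape)                     # torch.Size([1, 16, 16, 16])
```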
Week 7: Convolutional Neural Networks (cont.)
  • Common architectures: LeNet, AlexNet
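As a rough sketch, a LeNet-style network in `nn.Sequential` (this follows a common modern rendering of LeNet-5 for 32×32 inputs; the original paper differs in activations and subsampling details):

```python
import torch
import torch.nn as nn

lenet = nn.Sequential(
    nn.Conv2d(1, 6, kernel_size=5),   # 32x32 -> 28x28
    nn.ReLU(),
    nn.MaxPool2d(2),                  # -> 14x14
    nn.Conv2d(6, 16, kernel_size=5),  # -> 10x10
    nn.ReLU(),
    nn.MaxPool2d(2),                  # -> 5x5
    nn.Flatten(),                     # 16 * 5 * 5 = 400 features
    nn.Linear(400, 120),
    nn.ReLU(),
    nn.Linear(120, 84),
    nn.ReLU(),
    nn.Linear(84, 10),                # 10 class scores
)

logits = lenet(torch.randn(2, 1, 32, 32))
print(logits.shape)                   # torch.Size([2, 10])
```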
Week 8: Midterm
Week 9: Object Recognition and Detection
  • Introduction to object recognition and detection
  • Classical methods vs. deep learning approaches
  • Implementing object detection algorithms (YOLO)
  • Using a cloud service for training (tentative)
  Assignment: HW2
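A full YOLO implementation is beyond a snippet, but one core ingredient of any detector is the intersection-over-union (IoU) between predicted and ground-truth boxes. A minimal sketch, assuming `(x1, y1, x2, y2)` corner format (the example boxes are invented):

```python
import torch

def iou(box_a, box_b):
    """Intersection-over-Union of two boxes in (x1, y1, x2, y2) corner format."""
    x1 = torch.max(box_a[0], box_b[0])
    y1 = torch.max(box_a[1], box_b[1])
    x2 = torch.min(box_a[2], box_b[2])
    y2 = torch.min(box_a[3], box_b[3])
    inter = (x2 - x1).clamp(min=0) * (y2 - y1).clamp(min=0)   # 0 if no overlap
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

a = torch.tensor([0.0, 0.0, 2.0, 2.0])
b = torch.tensor([1.0, 1.0, 3.0, 3.0])
print(float(iou(a, b)))   # 1/7 ≈ 0.1429
```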
Week 10: Spring Break
Week 11: Natural Language Processing (NLP)
  • Tokenization, embeddings, sequence modeling
  • Building NLP models in PyTorch
  • Sentiment analysis and text classification
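Tokenization and embeddings can be previewed with a toy vocabulary (the sentence and vocab here are invented for illustration):

```python
import torch
import torch.nn as nn

# Whitespace tokenization and a toy vocabulary
sentence = "deep learning is fun"
tokens = sentence.split()
vocab = {"deep": 0, "learning": 1, "is": 2, "fun": 3}
ids = torch.tensor([vocab[t] for t in tokens])      # token ids: [0, 1, 2, 3]

# Embedding layer: map each id to a dense 8-dimensional vector
embed = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)
vectors = embed(ids)
print(vectors.shape)     # torch.Size([4, 8]) -- one vector per token
```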
Week 12: Transformers & Attention Mechanisms
  • Limitations of RNNs and Seq2Seq models
  • Additive vs. multiplicative attention; context vectors
  • Implementing attention and Seq2Seq models in PyTorch
  Assignment: HW3
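Multiplicative (dot-product) attention and a context vector take only a few lines of PyTorch (the dimensions and random encoder states are illustrative, and the keys double as values for simplicity):

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)
keys = torch.randn(5, 16)            # 5 encoder states of dimension 16
query = torch.randn(16)              # one decoder state

scores = keys @ query                # one dot-product score per encoder state: (5,)
weights = F.softmax(scores, dim=0)   # attention weights sum to 1
context = weights @ keys             # weighted sum -> context vector: (16,)

print(weights.sum())                 # tensor(1.0000)
print(context.shape)                 # torch.Size([16])
```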
Week 13: Transformers & Attention Mechanisms (cont.)
  • Self-attention and positional encoding
  • Transformer architecture (encoder-decoder)
  • Applications in NLP and vision
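One Transformer piece that fits in a short snippet is the sinusoidal positional encoding from the original "Attention Is All You Need" paper; a sketch (the sequence length and model dimension are arbitrary):

```python
import math
import torch

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encoding: sin on even dims, cos on odd dims."""
    pos = torch.arange(seq_len, dtype=torch.float32).unsqueeze(1)     # (seq_len, 1)
    div = torch.exp(torch.arange(0, d_model, 2, dtype=torch.float32)
                    * (-math.log(10000.0) / d_model))                 # (d_model/2,)
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(pos * div)
    pe[:, 1::2] = torch.cos(pos * div)
    return pe

pe = positional_encoding(seq_len=10, d_model=16)
print(pe.shape)        # torch.Size([10, 16])
print(pe[0, :2])       # position 0: sin(0) = 0, cos(0) = 1
```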
Week 14: Project Review
Week 15: Demo Presentations