This repository demonstrates how to implement deep learning models in PyTorch, focusing on regression and classification tasks.
This project implements three main types of models:
- Linear Regression
- Binary Classification (Moon Dataset)
- Multi-class Classification (Spiral Dataset)
A simple linear regression model that learns the mapping y = Wx + b
- Loss Function: L1Loss (Mean Absolute Error)
L1(y, ŷ) = (1/n) * Σ_i |y_i - ŷ_i|
- Optimizer: Stochastic Gradient Descent (SGD)
w = w - learning_rate * gradient
- Learning Rate: 0.01
- Epochs: 300
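Putting the pieces above together, a minimal training sketch might look like the following; the synthetic data (including the 0.7 / 0.3 weight and bias used to generate it) and the variable names are illustrative assumptions, not taken from the notebook:

```python
import torch
from torch import nn

# Synthetic data for y = 0.7 * x + 0.3 (weight/bias chosen for illustration)
X = torch.arange(0, 1, 0.02).unsqueeze(1)
y = 0.7 * X + 0.3

model = nn.Linear(in_features=1, out_features=1)      # y = Wx + b
loss_fn = nn.L1Loss()                                  # mean absolute error
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

for epoch in range(300):
    y_pred = model(X)
    loss = loss_fn(y_pred, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```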
Implementation of binary classification using a neural network on the make_moons dataset.
Sequential(
Linear(2 → 10)
ReLU()
Linear(10 → 10)
ReLU()
Linear(10 → 10)
ReLU()
Linear(10 → 1)
)
- Loss Function: Binary Cross Entropy with Logits
BCE(x, y) = -[y * log(σ(x)) + (1 - y) * log(1 - σ(x))]
- Optimizer: SGD
- Learning Rate: 0.1
- Epochs: 1000
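A condensed sketch of how this configuration can be wired together; the sample count, noise level, and random seed passed to make_moons are illustrative assumptions:

```python
import torch
from torch import nn
from sklearn.datasets import make_moons

# Moon-shaped dataset (sample count and noise are illustrative choices)
X, y = make_moons(n_samples=1000, noise=0.07, random_state=42)
X = torch.tensor(X, dtype=torch.float32)
y = torch.tensor(y, dtype=torch.float32)

model = nn.Sequential(
    nn.Linear(2, 10), nn.ReLU(),
    nn.Linear(10, 10), nn.ReLU(),
    nn.Linear(10, 10), nn.ReLU(),
    nn.Linear(10, 1),
)
loss_fn = nn.BCEWithLogitsLoss()   # sigmoid + binary cross entropy in one step
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

for epoch in range(1000):
    logits = model(X).squeeze(1)   # (N,) raw logits
    loss = loss_fn(logits, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```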
Implementation of multi-class classification on a spiral dataset.
Sequential(
Linear(2 → 10)
ReLU()
Linear(10 → 10)
ReLU()
Linear(10 → 10)
ReLU()
Linear(10 → 3)
)
- Loss Function: Cross Entropy Loss
CE(x, y) = -Σ_i y_i * log(softmax(x)_i)
- Optimizer: Adam
m_t = β_1 * m_{t-1} + (1 - β_1) * g_t
v_t = β_2 * v_{t-1} + (1 - β_2) * g_t^2
m̂_t = m_t / (1 - β_1^t),  v̂_t = v_t / (1 - β_2^t)
θ_t = θ_{t-1} - learning_rate * m̂_t / (√v̂_t + ε)
- Learning Rate: 0.1
- Epochs: 200
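A comparable sketch for the multi-class case; since scikit-learn has no built-in spiral generator, the CS231n-style make_spirals helper below is an assumed stand-in for however the notebook actually builds the dataset:

```python
import numpy as np
import torch
from torch import nn

def make_spirals(points_per_class=100, num_classes=3):
    """Generate interleaved 2D spirals, one arm per class (illustrative generator)."""
    X = np.zeros((points_per_class * num_classes, 2))
    y = np.zeros(points_per_class * num_classes, dtype=np.int64)
    for c in range(num_classes):
        ix = range(points_per_class * c, points_per_class * (c + 1))
        r = np.linspace(0.0, 1, points_per_class)                    # radius
        t = np.linspace(c * 4, (c + 1) * 4, points_per_class)        # angle
        t += np.random.randn(points_per_class) * 0.2                 # add noise
        X[ix] = np.c_[r * np.sin(t), r * np.cos(t)]
        y[ix] = c
    return torch.tensor(X, dtype=torch.float32), torch.tensor(y)

X, y = make_spirals()

model = nn.Sequential(
    nn.Linear(2, 10), nn.ReLU(),
    nn.Linear(10, 10), nn.ReLU(),
    nn.Linear(10, 10), nn.ReLU(),
    nn.Linear(10, 3),
)
loss_fn = nn.CrossEntropyLoss()    # applies log-softmax internally, expects raw logits
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)

for epoch in range(200):
    logits = model(X)              # (N, 3) raw class scores
    loss = loss_fn(logits, y)      # y holds integer class labels
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```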
- Python 3.8+
- PyTorch
- scikit-learn
- matplotlib
- numpy
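For reference, a minimal requirements.txt covering the notebook could look like this (package list only; no version pins are given in this document, so none are assumed):

```
torch
scikit-learn
matplotlib
numpy
jupyter
```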
git clone https://github.com/yourusername/pytorch-regression-classification.git
cd pytorch-regression-classification
pip install -r requirements.txt
jupyter notebook notebook.ipynb
- Implementation of three different types of neural networks
- Visualization of decision boundaries (see the plotting sketch after this list)
- Loss curve tracking and visualization
- Model performance analysis
- Comprehensive documentation
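For the decision-boundary visualization noted above, a helper along these lines is one common approach; the function name, grid resolution, and colour map are illustrative choices rather than the notebook's exact code:

```python
import numpy as np
import torch
import matplotlib.pyplot as plt

def plot_decision_boundary(model, X, y):
    """Evaluate the model on a 2D grid and shade each region by predicted class."""
    x_min, x_max = X[:, 0].min() - 0.1, X[:, 0].max() + 0.1
    y_min, y_max = X[:, 1].min() - 0.1, X[:, 1].max() + 0.1
    xx, yy = np.meshgrid(np.linspace(x_min, x_max, 200),
                         np.linspace(y_min, y_max, 200))
    grid = torch.tensor(np.c_[xx.ravel(), yy.ravel()], dtype=torch.float32)

    model.eval()
    with torch.no_grad():
        logits = model(grid)
    if logits.shape[1] > 1:                                  # multi-class: argmax over logits
        preds = logits.argmax(dim=1)
    else:                                                    # binary: threshold the sigmoid output
        preds = (torch.sigmoid(logits) > 0.5).long().squeeze(1)

    plt.contourf(xx, yy, preds.reshape(xx.shape).numpy(), cmap=plt.cm.RdYlBu, alpha=0.7)
    plt.scatter(X[:, 0], X[:, 1], c=y, s=20, cmap=plt.cm.RdYlBu, edgecolors="k")
    plt.show()
```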
- PyTorch - Deep Learning Framework
- scikit-learn - Dataset Generation
- Matplotlib - Visualization
- NumPy - Numerical Computations
This project is licensed under the MIT License - see the LICENSE.md file for details
Contributions, issues, and feature requests are welcome! Feel free to check the issues page.