INFO 527 — Neural Networks: Assignment 2 (Backpropagation). Implementation of a feedforward neural network with sigmoid activation and training via backpropagation using gradient descent. Includes gradient computation, prediction, and pytest validation. Part of the Master’s in MIS/ML program at the University of Arizona.


INFO 527 — Neural Networks

Assignment 2: Backpropagation

Objectives

The objectives of this assignment are to:

  1. Implement feedforward prediction for a simple neural network.
  2. Implement training via backpropagation for a simple neural network.

This assignment builds on the concepts from Assignment 1 (String-to-Vectors) and focuses on understanding how neural networks learn through gradient-based optimization.


Files in This Repository

File                               Description
notebooks/nn.ipynb                 Main notebook containing the implementation of the feedforward and backpropagation methods.
notebooks/nn.py                    Exported Python version of the notebook for pytest testing.
notebooks/back_propagation.ipynb   Notebook for running tests and recording pytest results.
results/nn.html                    Exported HTML version of the notebook for grading and readability.
tests/test_nn.py                   Provided test file used to validate correctness using pytest.
README.md                          This file, describing objectives, setup, and submission instructions.
requirements.txt                   Python dependencies for reproducibility.
.gitignore                         Specifies intentionally untracked files to keep the repository clean.

Class Overview: SimpleNetwork

The SimpleNetwork class implements a basic feedforward neural network where all units use a sigmoid activation function.
It supports both forward propagation and training via backpropagation using gradient descent.

Implemented Methods:

  1. __init__(self, *layer_weights) – Initializes the network with a list of weight matrices.
  2. predict(self, input_matrix) – Performs forward propagation and returns continuous outputs.
  3. predict_zero_one(self, input_matrix) – Converts outputs to binary predictions (0 or 1).
  4. gradients(self, input_matrix, output_matrix) – Computes gradients for each weight matrix via backpropagation.
  5. train(self, input_matrix, output_matrix, iterations, learning_rate) – Updates weights using gradient descent.
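The methods above can be sketched roughly as follows. This is an illustrative outline, not the graded solution: it assumes no bias terms and a half-mean-squared-error loss, and the exact loss scaling and update rule in nn.ipynb may differ.

```python
import numpy as np

def sigmoid(x):
    """Logistic sigmoid, applied element-wise."""
    return 1.0 / (1.0 + np.exp(-x))

class SimpleNetwork:
    def __init__(self, *layer_weights):
        # One weight matrix per layer; every unit uses a sigmoid activation.
        self.layer_weights = list(layer_weights)

    def predict(self, input_matrix):
        # Forward propagation: multiply by each layer's weights, then sigmoid.
        activation = input_matrix
        for weights in self.layer_weights:
            activation = sigmoid(activation @ weights)
        return activation

    def predict_zero_one(self, input_matrix):
        # Threshold the continuous outputs at 0.5 to get binary predictions.
        return (self.predict(input_matrix) >= 0.5).astype(int)

    def gradients(self, input_matrix, output_matrix):
        # Forward pass, caching every layer's activations.
        activations = [input_matrix]
        for weights in self.layer_weights:
            activations.append(sigmoid(activations[-1] @ weights))
        n = input_matrix.shape[0]
        # Output-layer error times the sigmoid derivative a * (1 - a),
        # for a half-mean-squared-error loss (an assumption of this sketch).
        delta = (activations[-1] - output_matrix) \
            * activations[-1] * (1.0 - activations[-1])
        grads = [None] * len(self.layer_weights)
        for i in reversed(range(len(self.layer_weights))):
            grads[i] = activations[i].T @ delta / n
            if i > 0:  # propagate the error back to the previous layer
                delta = (delta @ self.layer_weights[i].T) \
                    * activations[i] * (1.0 - activations[i])
        return grads

    def train(self, input_matrix, output_matrix, iterations=10, learning_rate=0.1):
        # Plain batch gradient descent: subtract the scaled gradients.
        for _ in range(iterations):
            grads = self.gradients(input_matrix, output_matrix)
            for weights, grad in zip(self.layer_weights, grads):
                weights -= learning_rate * grad
```

For example, `SimpleNetwork(w1, w2)` with `w1` of shape (2, 3) and `w2` of shape (3, 1) gives a 2-3-1 network whose `gradients` return matches the weight shapes, so `train` can update each matrix in place.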

Environment Setup

This project reuses the shared environment from previous assignments.

# Activate your existing environment
source ~/venvs/ml-env/bin/activate

# Navigate to the assignment directory
cd ~/projects/info527-neural-networks-assignment2

# Install dependencies
pip install -r requirements.txt

requirements.txt

numpy>=1.24
pytest>=7.0
jupyter>=1.0
ipykernel>=6.0

Testing Instructions

  1. Run all cells in nn.ipynb to complete the implementation.

  2. Export the notebook as:

    • nn.ipynb
    • nn.html
    • nn.py
  3. Place the exported nn.py file in the same directory as test_nn.py.

  4. Run pytest to verify correctness:

pytest tests/test_nn.py

Expected Output:

============================= test session starts ==============================
collected 5 items

test_nn.py .....                                                         [100%]

============================== 5 passed in 0.47s ===============================
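The export and test steps above can also be run from the command line. One plausible way, assuming the repository layout from the files table (check the output paths against your local setup):

```shell
# Export the completed notebook to a Python script and an HTML report
jupyter nbconvert --to script notebooks/nn.ipynb --output-dir notebooks   # writes nn.py
jupyter nbconvert --to html notebooks/nn.ipynb --output-dir results       # writes nn.html

# Run the provided test suite against the exported module
pytest tests/test_nn.py
```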

Submission Files

Submit the following for grading:

  • nn.ipynb
  • nn.html
  • nn.py
  • back_propagation.ipynb (with pytest results)

Grading Criteria

Grading is based solely on the correctness of the code written in nn.ipynb.

Method               Description                         Marks
__init__             Network initialization              2
forward_propagation  Compute layer activations           3
predict              Forward prediction                  2
predict_zero_one     Binary conversion                   2
gradients            Compute backpropagation gradients   3
train                Gradient descent weight updates     3
Total                                                    15

Repository Information

Repository name: info527-neural-networks-assignment2

Description: Implementation of a simple feedforward neural network trained via backpropagation. Includes gradient computation and model updates using NumPy and pytest validation. Part of the Master’s in MIS/ML program at the University of Arizona.


Author

This repository was completed as part of INFO 527: Neural Networks, under the M.S. in Information Science and Machine Learning program (2023–2025).
