The objectives of this assignment are to:
- Implement feed-forward prediction for a simple neural network.
- Implement training via backpropagation for a simple neural network.
This assignment builds on the concepts from Assignment 1 (String-to-Vectors) and focuses on understanding how neural networks learn through gradient-based optimization.
| File | Description |
|---|---|
| `notebooks/nn.ipynb` | Main notebook containing the implementation of the feedforward and backpropagation methods. |
| `notebooks/nn.py` | Exported Python version of the notebook for pytest testing. |
| `notebooks/back_propagation.ipynb` | Notebook for running tests and recording pytest results. |
| `results/nn.html` | Exported HTML version of the notebook for grading and readability. |
| `tests/test_nn.py` | Provided test file used to validate correctness using pytest. |
| `README.md` | This file, describing objectives, setup, and submission instructions. |
| `requirements.txt` | Python dependencies for reproducibility. |
| `.gitignore` | Specifies intentionally untracked files to keep the repository clean. |
The SimpleNetwork class implements a basic feedforward neural network where all units use a sigmoid activation function.
It supports both forward propagation and training via backpropagation using gradient descent.
Implemented methods:

- `__init__(self, *layer_weights)` – Initializes the network with a list of weight matrices.
- `predict(self, input_matrix)` – Performs forward propagation and returns continuous outputs.
- `predict_zero_one(self, input_matrix)` – Converts outputs to binary predictions (0 or 1).
- `gradients(self, input_matrix, output_matrix)` – Computes gradients for each weight matrix via backpropagation.
- `train(self, input_matrix, output_matrix, iterations, learning_rate)` – Updates weights using gradient descent.
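The methods above can be sketched as follows. This is a minimal illustration, not the graded solution: it assumes no bias terms, a mean-squared-error-style objective, and inputs/outputs as 2-D NumPy arrays (one row per example); the assignment's exact conventions may differ.

```python
import numpy as np

def sigmoid(x):
    """Element-wise logistic sigmoid."""
    return 1.0 / (1.0 + np.exp(-x))

class SimpleNetwork:
    """Sketch of a fully-connected network where every unit is sigmoid."""

    def __init__(self, *layer_weights):
        self.layer_weights = [np.asarray(w, dtype=float) for w in layer_weights]

    def predict(self, input_matrix):
        # Forward propagation: multiply by each weight matrix, then squash.
        a = np.asarray(input_matrix, dtype=float)
        for w in self.layer_weights:
            a = sigmoid(a @ w)
        return a

    def predict_zero_one(self, input_matrix):
        # Threshold the continuous outputs at 0.5.
        return (self.predict(input_matrix) >= 0.5).astype(int)

    def gradients(self, input_matrix, output_matrix):
        # Forward pass, caching every layer's activations.
        activations = [np.asarray(input_matrix, dtype=float)]
        for w in self.layer_weights:
            activations.append(sigmoid(activations[-1] @ w))
        n = activations[0].shape[0]
        # Backward pass: error term for the output layer, using
        # sigmoid'(z) = a * (1 - a), then propagate through each layer.
        delta = (activations[-1] - output_matrix) * activations[-1] * (1 - activations[-1])
        grads = []
        for i in reversed(range(len(self.layer_weights))):
            grads.insert(0, activations[i].T @ delta / n)
            if i > 0:
                delta = (delta @ self.layer_weights[i].T) * activations[i] * (1 - activations[i])
        return grads

    def train(self, input_matrix, output_matrix, iterations=1000, learning_rate=0.1):
        # Plain batch gradient descent, updating weights in place.
        for _ in range(iterations):
            for w, g in zip(self.layer_weights, self.gradients(input_matrix, output_matrix)):
                w -= learning_rate * g
```

A quick usage check: training a 2–3–1 network on a small dataset should drive the mean squared error down over iterations, and `predict_zero_one` should return only 0s and 1s.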
This project reuses the shared environment from previous assignments.
```shell
# Activate your existing environment
source ~/venvs/ml-env/bin/activate

# Navigate to the assignment directory
cd ~/projects/info527-neural-networks-assignment2

# Install dependencies
pip install -r requirements.txt
```

`requirements.txt`:

```
numpy>=1.24
pytest>=7.0
jupyter>=1.0
ipykernel>=6.0
```
- Run all cells in `nn.ipynb` to complete the implementation.
- Export the notebook as `nn.html` and `nn.py`.
- Place the exported `nn.py` file in the same directory as `test_nn.py`.
- Run pytest to verify correctness:

```shell
pytest tests/test_nn.py
```

Expected output:

```
============================= test session starts ==============================
collected 5 items

test_nn.py .....                                                         [100%]

============================== 5 passed in 0.47s ===============================
```
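The export step can be done from the command line; a sketch assuming the standard `jupyter nbconvert` converters and the repository layout described above:

```shell
# Export the notebook to HTML (for grading) and to a plain .py script (for pytest)
jupyter nbconvert --to html notebooks/nn.ipynb --output-dir results
jupyter nbconvert --to script notebooks/nn.ipynb --output-dir notebooks
```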
Submit the following for grading:
- `nn.ipynb`
- `nn.html`
- `nn.py`
- `back_propagation.ipynb` (with pytest results)
Grading is based solely on the correctness of the code written in `nn.ipynb`.
| Method | Description | Marks |
|---|---|---|
| `__init__` | Network initialization | 2 |
| `forward_propagation` | Compute layer activations | 3 |
| `predict` | Forward prediction | 2 |
| `predict_zero_one` | Binary conversion | 2 |
| `gradients` | Compute backpropagation gradients | 3 |
| `train` | Gradient descent weight updates | 3 |
| **Total** | | **15** |
Repository name: info527-neural-networks-assignment2
Description: Implementation of a simple feedforward neural network trained via backpropagation. Includes gradient computation and model updates using NumPy and pytest validation. Part of the Master’s in MIS/ML program at the University of Arizona.
This repository was completed as part of INFO 527: Neural Networks, under the M.S. in Information Science and Machine Learning program (2023–2025).