Speed decreases significantly for real-time inference #1

@Saqueeb

Description

Hi there,
Thank you for this amazing repo. I converted my custom object detection model using this repo, and while testing I found that the inference speed dropped from 25-35 FPS to 1-2.5 FPS, which is really poor. I need to convert the Darknet model to TF.js, and to do that I first tried converting it to a TensorFlow model. Do you have any idea how I can keep the speed and accuracy while converting the weights to a TensorFlow/TF.js model?
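For reference, this is roughly how I measured FPS: frames processed per wall-clock second over a batch of test frames, after a short warmup. A minimal sketch (the `infer` callable here is a stand-in for the actual model's predict call, not something from this repo):

```python
import time

def measure_fps(infer, frames, warmup=5):
    """Return average FPS of `infer` over `frames`, after a short warmup.

    `infer` is a placeholder for the model's per-frame inference call;
    `frames` is any sequence of inputs it accepts.
    """
    # Warmup runs are excluded from timing (first calls are often slower).
    for frame in frames[:warmup]:
        infer(frame)

    start = time.perf_counter()
    for frame in frames:
        infer(frame)
    elapsed = time.perf_counter() - start

    return len(frames) / elapsed
```

With the original Darknet model this reports 25-35 FPS on my test clip; after conversion the same loop reports 1-2.5 FPS.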
