Hi,
I'm using the Python Redmine library to fetch time entries like this:
entries = client.time_entry.filter(offset=0, limit=100)
But each request of 100 entries takes ~50–60 seconds, so fetching all ~53,000 records takes hours.
There's no network issue on my side, so the delay seems to come either from the Redmine API itself or from how the Python Redmine library handles bulk pagination.
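For reference, my full loop looks roughly like this (a simplified sketch; `fetch_page` stands in for the actual `client.time_entry.filter` call):

```python
def fetch_all(fetch_page, page_size=100):
    """Page through an offset/limit endpoint until exhausted.

    fetch_page(offset=..., limit=...) must return a list of records;
    a short (or empty) page signals the end of the data.
    """
    records = []
    offset = 0
    while True:
        page = list(fetch_page(offset=offset, limit=page_size))
        records.extend(page)
        if len(page) < page_size:  # last (possibly empty) page
            break
        offset += page_size
    return records

# In my script this is called as:
# entries = fetch_all(lambda **kw: client.time_entry.filter(**kw))
```

Each iteration of that loop is where the ~50–60 second wait happens.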
Is there an efficient way to:
1. Speed up time_entry fetching?
2. Fetch all 53k entries with less overhead?
3. Cache responses locally and reprocess later?
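For the caching idea, I was thinking of something like dumping each entry to a local SQLite table as JSON, so I only pay the fetch cost once and can reprocess offline. A sketch of what I have in mind (field names like `hours` are illustrative; real time entries carry more attributes):

```python
import json
import sqlite3

def cache_entries(db_path, entries):
    """Store raw entries (dicts) as JSON rows keyed by id."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS time_entries (id INTEGER PRIMARY KEY, raw TEXT)"
    )
    con.executemany(
        "INSERT OR REPLACE INTO time_entries (id, raw) VALUES (?, ?)",
        [(e["id"], json.dumps(e)) for e in entries],
    )
    con.commit()
    con.close()

def load_entries(db_path):
    """Reload cached entries for offline reprocessing."""
    con = sqlite3.connect(db_path)
    rows = con.execute("SELECT raw FROM time_entries ORDER BY id").fetchall()
    con.close()
    return [json.loads(raw) for (raw,) in rows]
```

The `INSERT OR REPLACE` keyed on `id` would also let me re-run the fetch incrementally without duplicating rows. Is this a reasonable pattern, or is there a better-supported approach?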
Any advice or an alternative approach would be appreciated.
Thanks!