feat: add multipart encoder when uploading (large) files (#88)

Merged: suricactus merged 1 commit into master on Oct 22, 2025.
Conversation
Force-pushed from d5a44ee to 1e4cb48.
lukasgraf approved these changes on Oct 21, 2025 and left a comment:
LGTM! 👍 🎉
I also took this opportunity to familiarize myself a bit more with qfieldcloud-sdk-python, and checked out your branch and tested it locally.
Works like a charm, as far as I can tell. The progress bar now moves a lot more naturally for large uploads, compared to before.
Though it's still not perfectly in sync: after uploading a 9 GB file, the progress bar finishes and the output gets stuck at `Uploading "9GB.dat"...: 9.66GB [02:09, 74.5MB/s]` for quite a while. But I'm pretty sure that is unavoidable, because during that time:
- the file is streamed from nginx's buffer to Django
- processed by Django
- committed to storage
- and only then will Django answer with a response + status code.
So I think for huge files like that there is always going to be some delay on the "last mile".
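The client-side view of that "last mile" gap can be sketched with a minimal file-like wrapper (a hypothetical helper for illustration, not part of the SDK): `requests` streams any object that exposes `read()` when it is passed as the request body, so a progress callback fires as bytes leave the client, while `post()` itself only returns once the server has finished processing and responded.

```python
import io


class ProgressReader:
    """File-like wrapper that reports upload progress as chunks are read.

    Illustrative sketch: the callback reaches 100% as soon as the last
    chunk is handed to the transport, which can be long before the server
    (nginx buffer -> Django -> storage) sends back its response.
    """

    def __init__(self, fileobj, total, on_progress):
        self._fileobj = fileobj
        self.total = total
        self.bytes_read = 0
        self._on_progress = on_progress

    def read(self, size=-1):
        chunk = self._fileobj.read(size)
        self.bytes_read += len(chunk)
        self._on_progress(self.bytes_read, self.total)
        return chunk


# Usage sketch (names like `update_bar` are placeholders):
# reader = ProgressReader(open("9GB.dat", "rb"), total=file_size, on_progress=update_bar)
# response = requests.post(url, data=reader)  # returns only after the server replies
```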
With the previous implementation of `upload_files`, the SDK was loading the whole file to be uploaded into memory and only then sending it over the socket. This was painfully slow, memory constrained, and lacked real progress feedback. This PR therefore introduces `requests_toolbelt`'s `MultipartEncoderMonitor`, which streams the file and keeps memory consumption under control. This is quite useful not only for CLI consumers of the SDK, but also for QFieldCloud `qgis` workers, as memory does not grow too much when uploading large files after a `package` or `apply_deltas` job.
Force-pushed from 1e4cb48 to c8f94f1.
Also added a bit of typing.