Description
https://discourse.canceridc.dev/t/direct-download-from-the-portal-does-not-work/761
@s-paquette in https://nciimagingdat-m8a6349.slack.com/archives/C045VQUAZ0S/p1761345921749809:
confirmed the issue is a path length issue; if a user's maximum folder path length isn't increased sufficiently, the error is rather opaque. I'm going to put in a solution to detect it, but in this specific case I suspect anyone on a Mac is unlikely to run into this because IIRC OSX's default path length max is generous (around 1024 characters), while in Windows it's 260.
And the full path length of these files (collection+case+study+series+instance) is around 270+.
(I tried on 2 Windows 11 machines and also encountered the error; I suspect Steve is on Linux or OSX.)
Since we don't know the absolute path to the download folder (the browser treats it as confidential, so it isn't exposed to the web page code), there's no way for us to know in advance whether any given suffix path we append will exceed the system's path limit. What we can and should do is check that the file was downloaded and saved correctly by re-reading it and verifying its checksum against the data in the bucket. This can be streamed, so there's no need to hold the full file in memory, and it's very fast compared to the download itself, so there's virtually no overhead. If the hashes don't match, we can let the user know and suggest that the file path length might be the cause.
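For reference, a minimal sketch of the streamed verification idea in Python (the function name, chunk size, and `expected_md5` argument are illustrative; the expected hash would come from the bucket's object metadata):

```python
import hashlib


def verify_download(local_path, expected_md5, chunk_size=8 * 1024 * 1024):
    """Stream the saved file through MD5 so the whole file never sits in memory."""
    md5 = hashlib.md5()
    with open(local_path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            md5.update(chunk)
    if md5.hexdigest() != expected_md5:
        # Mismatch (or a truncated file): surface a message that hints at the
        # OS path-length limit as a likely cause.
        raise RuntimeError(
            f"Checksum mismatch for {local_path}; the download may not have been "
            "saved correctly, possibly because the full file path exceeds the "
            "operating system's path length limit."
        )
```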
You can get the checksum for objects in S3 automatically (from S3 itself, without downloading the object) and then compare it with what you calculate locally.
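A sketch of that lookup using boto3 (assuming boto3 is available and the objects were uploaded in a single part, in which case the ETag is the object's MD5; multipart uploads would need a stored checksum in object metadata instead):

```python
import boto3


def bucket_md5(bucket, key):
    """Read the stored checksum from S3 object metadata without downloading the object."""
    s3 = boto3.client("s3")
    head = s3.head_object(Bucket=bucket, Key=key)
    # For single-part uploads the ETag is the hex MD5 of the object; for
    # multipart uploads it is a hash-of-hashes and cannot be compared directly.
    return head["ETag"].strip('"')
```

The local hash from the sketch above can then be compared against this value to decide whether to warn the user about a possible path-length problem.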