Replies: 3 comments 2 replies
-
From profiling, it looks like the slowness comes from
-
Hi, just to add to @wlandau's assessment here. The philosophy of geotargets results in essentially copying your file sources into the target store, which means you are re-writing your external file. This does add some write overhead, but we think it improves reproducibility in the general case. In your case the overhead is perhaps unneeded because you aren't processing the source file in any way, but if you were doing some calculations on it, you would want to be able to store those results separately in the target store. To Will's point, there are some options you can tinker with in terms of arguments passed to
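As a hedged illustration of the kind of arguments referred to above (the exact ones were lost in extraction), `geotargets::tar_terra_rast()` exposes `filetype` and `gdal` arguments that control the GDAL driver and creation options used when the raster is written into the target store. Argument names are taken from the geotargets documentation as I understand it; check your installed version, and note `height_file` is a placeholder defined elsewhere in the plan.

```r
# _targets.R (sketch, assuming geotargets' filetype/gdal arguments)
library(targets)
library(geotargets)

list(
  tar_terra_rast(
    load_height,
    terra::rast(height_file),
    filetype = "GTiff",
    gdal = c("COMPRESS=NONE", "BIGTIFF=YES")  # trade disk space for write speed
  )
)
```

Disabling compression can noticeably cut the time spent copying a multi-gigabyte raster into the store, at the cost of a larger `_targets/` directory.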
Alternately, if you are willing to have your pipeline depend on the external file, you could consider using

```r
tar_terra_vrt(
  load_height,
  terra::rast(height_file),
  repository = "local"
)
```

We originally intended If you run into any issues with any of the above, please let us know over on the geotargets issue tracker: https://github.com/ropensci/geotargets/issues
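For completeness, a minimal sketch of the other common way to make a pipeline depend on an external file without copying it into the store: a plain targets file target. This uses the standard targets API (`format = "file"`); the `"height.tif"` path is a placeholder, not a path from this thread.

```r
# _targets.R (sketch combining a file target with tar_terra_vrt)
library(targets)
library(geotargets)

list(
  # Track the external raster by path: downstream targets re-run only
  # when the file's hash changes ("height.tif" is a placeholder path).
  tar_target(height_file, "height.tif", format = "file"),
  tar_terra_vrt(load_height, terra::rast(height_file), repository = "local")
)
```

The file target makes the dependency explicit in the pipeline graph, so invalidation is driven by the file's contents rather than by re-writing a 3 GB copy.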
-
Thanks a lot @wlandau @brownag for your quick answers and the proposed solutions. I'm a little lost with the different options to manage data sources when they are needed during dynamic branching over crew workers. To better illustrate, here is the workflow:
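The workflow itself did not survive extraction. A minimal hypothetical plan showing the pattern under discussion, dynamic branching over a raster with branches dispatched to crew workers, might look like the following; every name and path here is a placeholder, not the original poster's plan.

```r
# _targets.R (hypothetical sketch, not the original poster's actual plan)
library(targets)
library(geotargets)
library(crew)

# Dispatch branches to local crew workers.
tar_option_set(controller = crew_controller_local(workers = 2))

list(
  tar_target(height_file, "height.tif", format = "file"),  # placeholder path
  tar_terra_rast(load_height, terra::rast(height_file)),
  tar_target(tiles, seq_len(4)),  # placeholder tile indices
  tar_target(
    tile_stat,
    terra::global(load_height, "mean"),  # a real plan would subset by `tiles`
    pattern = map(tiles)                 # one branch per tile, one per worker
  )
)
```

The tension the poster describes is visible here: each branch running on a crew worker needs access to `load_height`, so how the raster is stored (copied into the target store vs. referenced as an external file) determines how much data moves to each worker.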

-
Help
Description
Hi,
I'm having trouble finding a good way to load and use a large raster with targets, geotargets and crew.
Using this simple plan with a "not so big" 3 GB raster, it takes a very long time for the load_height target to complete.
The raster is available here :
https://ent.normandie-univ.fr/filex/get?k=N6OrrcT3OB54wxz3Xys
Is there some way to speed up the rast() loading step?
Any help appreciated :)
Src