Sharing targets through S3 #852
ropensci-books/targets@807bcde may help a bit (see also https://books.ropensci.org/targets/data.html#local-data-store and #851). When a target is in the cloud (e.g. […]
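The manual section linked above describes reading targets from another pipeline's local data store. A minimal sketch of that idea, hedged: this is an illustration rather than the exact suggestion from the truncated reply, and it reuses the store path `store-a` and the target name `output_file` from the thread below:

```r
# Inside pipeline b (for example, in a target's command), read a target
# that pipeline a already built by pointing tar_read() at a's data store.
library(targets)
data_from_a <- tar_read(output_file, store = "store-a")
```

`tar_read()` accepts a `store` argument, so pipeline `b` can consume `a`'s output without the two pipelines sharing a single store.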
@wlandau: Thank you! I will give it a try! Regarding tracking it via […]
Hi folks,
First of all: great package! Thanks for creating targets! 😊
Maybe you can help with the following challenge: I have two pipelines, `a` and `b`, and I would like to use data from pipeline `a` within `b`. Ideally I'd like to use S3 as a backend to share the data as files. I set up the S3 connection as described in the docs, and it works properly. The docs seem to suggest loading the RDS files from the other store directly, but not using cloud storage as a shared backend. I have come up with the following non-working minimal example:
`a.R`: […]

`b.R`: […]

`a` runs charmingly via `tar_make(script = "b.R", store = "store-a")`. I can see the `output_file` in the S3 bucket (as `_targets/objects/output_file`, with no reference to `store-a`). Now running `tar_make(script = "b.R", store = "store-b")` throws an error (coming from `tar_assert_path`). It seems like I cannot reference the other file: […] I have also tried other input file paths, without success.
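For context, here is a hedged sketch of what the elided `a.R` side of such a setup might look like; the bucket name `my-bucket` is a placeholder, and this follows the general cloud-storage pattern from the targets manual rather than the exact example lost in the thread:

```r
# a.R (sketch): define a file target whose data is stored on S3.
library(targets)

tar_option_set(
  repository = "aws",  # store target data in the cloud
  resources = tar_resources(
    aws = tar_resources_aws(
      bucket = "my-bucket",  # placeholder bucket name
      prefix = "store-a"     # placeholder key prefix for this pipeline
    )
  )
)

list(
  tar_target(
    output_file,
    {
      # Write the file and return its path, as format = "file" requires.
      writeLines("shared data", "output.txt")
      "output.txt"
    },
    format = "file"
  )
)
```

With `repository = "aws"`, targets uploads the tracked file to the bucket under the configured prefix, which matches the `_targets/objects/output_file` object observed in the bucket above.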
Given that error, it seems like I am not fully understanding how files and the cloud backends are supposed to work. I saw that you had a discussion about similar issues in the past, as in here.
I would appreciate a hint on how to fix this, or a suggestion for a workaround. I can also live with a "don't use it like this", though.
Thanks a lot,
Seb