Creating In-Memory dataset in GCP bucket #7630
Sanjaykrishnamurthy started this conversation in General
- Can you clarify the issue? Please note that …
- I am reading a custom dataset into a PyG DataLoader from a GCP bucket. However, the preprocessed files get saved locally instead of in the remote GCP bucket. How can I override the processed_file_names function so that the processed files end up in the GCP bucket? Here is a snippet of my code:
```python
import gcsfs
from google.cloud import storage  # imported in the original snippet, unused here

fs = gcsfs.GCSFileSystem()
```
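One possible approach, sketched below under stated assumptions: PyG writes processed files into the local `processed_dir` under `root`, so the dataset can let that happen as usual and then mirror the result to the bucket with gcsfs (and pull it back down on later runs). The bucket path `gs://my-bucket/pyg-data`, the class name `GCSBackedDataset`, and the helper `build_data_list()` are hypothetical placeholders, not PyG API or anything from the original post.

```python
# Minimal sketch: process locally (as PyG expects), then mirror to a GCS bucket.
# "gs://my-bucket/pyg-data" and build_data_list() are hypothetical placeholders.
import gcsfs
import torch
from torch_geometric.data import InMemoryDataset

BUCKET_PREFIX = "gs://my-bucket/pyg-data"  # hypothetical bucket path


class GCSBackedDataset(InMemoryDataset):
    def __init__(self, root, transform=None, pre_transform=None):
        self.fs = gcsfs.GCSFileSystem()
        super().__init__(root, transform, pre_transform)
        # Classic PyG loading pattern; newer releases also offer self.load(...).
        self.data, self.slices = torch.load(self.processed_paths[0])

    @property
    def raw_file_names(self):
        return []  # raw data is read straight from the bucket inside process()

    @property
    def processed_file_names(self):
        return ["data.pt"]

    def process(self):
        remote_path = f"{BUCKET_PREFIX}/processed/data.pt"

        # Reuse a previously processed copy from the bucket if one exists.
        if self.fs.exists(remote_path):
            self.fs.get(remote_path, self.processed_paths[0])
            return

        # build_data_list() is a hypothetical helper that reads the raw files
        # via gcsfs and returns a list of torch_geometric.data.Data objects.
        data_list = build_data_list(self.fs)
        if self.pre_transform is not None:
            data_list = [self.pre_transform(d) for d in data_list]

        # Collate into the single tensors that InMemoryDataset expects,
        # save locally, then upload the processed file to the bucket so
        # later runs (or other machines) can download it instead of
        # reprocessing from scratch.
        data, slices = self.collate(data_list)
        torch.save((data, slices), self.processed_paths[0])
        self.fs.put(self.processed_paths[0], remote_path)
```

As an aside, recent PyG releases route dataset file I/O through fsspec, so depending on the installed version it may also be possible to pass a `gs://` path directly as `root` (with gcsfs installed); that is worth verifying against your version before relying on it.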