Open
Labels
custom gym env (Issue related to Custom Gym Env) · more information needed (Please fill the issue template completely) · question (Further information is requested)
Description
❓ Question
Hello,
I am saving my DQN model to a file and noticed it is taking up about 30 GB of disk space. When I unzip the archive, over 28 GB comes from the "data" object inside it, which the SB3 docs describe as "a JSON file of class-parameters (dictionary)". The actual PyTorch model is under 1 GB, so it seems something is being serialized in my implementation that shouldn't be. The 28 GB is about the same size as my replay buffer, since I am doing DRL with image observations. When I train the same code with a different observation space (a few small vectors instead of images), the zip file drops back below a gigabyte, so the problem seems related to the observation space somehow. The "data" file itself is too big to open and inspect directly.
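To pinpoint which entry inside the zip dominates the size without extracting the whole 30 GB archive, here is a minimal sketch using Python's standard `zipfile` module (the filename `dqn_model.zip` is a placeholder for the actual save path):

```python
import zipfile

# Placeholder path to the saved SB3 model archive.
MODEL_PATH = "dqn_model.zip"

with zipfile.ZipFile(MODEL_PATH) as archive:
    # Print each archive member with its uncompressed size, largest first,
    # to see which entry accounts for the ~28 GB.
    for info in sorted(archive.infolist(), key=lambda i: i.file_size, reverse=True):
        print(f"{info.filename}: {info.file_size / 1e9:.2f} GB")
```

This only reads the archive's central directory, so it runs quickly even on a very large zip.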
Checklist
- I have checked that there is no similar issue in the repo
- I have read the documentation
- If code there is, it is minimal and working
- If code there is, it is formatted using the markdown code blocks for both code and stack traces.