ConfigTypeError when using Optuna for hyperparameter tuning in Google Colab environment #1697
Unanswered
LeonardAndreasNapitupulu
asked this question in
Q&A - get help using NeuralProphet
Replies: 0 comments
Hello NeuralProphet team and community,
I'm currently working on a project that uses neuralprophet for time series forecasting, and I'm trying to integrate optuna for hyperparameter tuning. However, I keep running into a persistent ConfigTypeError that appears to be related to dependency conflicts in the Google Colab environment.
I would be very grateful for any guidance you could provide.
The Problem
When I run study.optimize(), the first trial immediately fails with the following error:
```
ConfigTypeError: Too few arguments for typing.Dict; actual 1, expected 2
```
This error occurs deep within the pytorch-lightning and omegaconf libraries during the model.fit() call inside my Optuna objective function.
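For context, here is a simplified sketch of the kind of objective function I'm running; the data file path, the search space, and the metric column I return are illustrative placeholders rather than my exact code:

```python
import pandas as pd
import optuna
from neuralprophet import NeuralProphet

# Placeholder dataset: a DataFrame with the 'ds' (datestamp) and 'y' (value)
# columns that NeuralProphet expects.
df = pd.read_csv("my_timeseries.csv")

def objective(trial):
    # Illustrative search space; parameter ranges are not my actual settings.
    params = {
        "n_changepoints": trial.suggest_int("n_changepoints", 5, 50),
        "learning_rate": trial.suggest_float("learning_rate", 1e-3, 1e-1, log=True),
        "seasonality_mode": trial.suggest_categorical(
            "seasonality_mode", ["additive", "multiplicative"]
        ),
    }
    model = NeuralProphet(**params)

    # The ConfigTypeError is raised inside this fit call, not in the Optuna code.
    metrics = model.fit(df, freq="D")

    # Minimize the final training loss reported by NeuralProphet
    # (column name shown here is illustrative).
    return metrics["Loss"].iloc[-1]

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=20)
```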
Alternatively, are there any other methods for tuning NeuralProphet hyperparameters that you would recommend?