
Conversation

@FilippoOlivo (Member)

No description provided.

@FilippoOlivo marked this pull request as draft November 4, 2024 14:08
@FilippoOlivo marked this pull request as ready for review November 7, 2024 12:58
The review comments below refer to this excerpt from the diff:

default :class:`torch.nn.MSELoss`.
"""
if optimizers is None:
    optimizers = TorchOptimizer(torch.optim.Adam, lr=0.001)
@dario-coscia (Collaborator) Nov 11, 2024

why are you not passing kwargs?
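
One possible reading of the question, as a hypothetical sketch (the default_optimizer helper and the **kwargs forwarding are assumptions for illustration, not PINA's actual API):

import torch

from pina.optim import TorchOptimizer

def default_optimizer(optimizers=None, **kwargs):
    # Hypothetical sketch only: forward extra keyword arguments into the
    # default optimizer instead of hard-coding lr=0.001 as the only setting.
    if optimizers is None:
        kwargs.setdefault("lr", 0.001)
        optimizers = TorchOptimizer(torch.optim.Adam, **kwargs)
    return optimizers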

@FilippoOlivo (Member, Author) Nov 11, 2024

kwargs are passed when TorchOptimizer is initialized outside the Solver class, for example:

import torch
from torch.optim import SGD

from pina import Condition, LabelTensor
from pina.optim import TorchOptimizer
from pina.problem import AbstractProblem
from pina.solver import SupervisedSolver

# Optimizer kwargs (lr=0.1 here, plus any other SGD argument) are fixed
# when TorchOptimizer is built, outside the solver.
optimizer = TorchOptimizer(SGD, lr=0.1)

# Toy labelled data: two input variables, one output variable.
in_ = LabelTensor(torch.rand((10, 2)), ['u_0', 'u_1'])
out_ = LabelTensor(torch.rand((10, 1)), ['u'])

class TestProblem(AbstractProblem):
    input_variables = ['u_0', 'u_1']
    output_variables = ['u']

    conditions = {
        'data': Condition(input_points=in_, output_points=out_),
    }

# Placeholder model: any torch.nn.Module mapping the two inputs to one output.
model = torch.nn.Linear(2, 1)

solver = SupervisedSolver(problem=TestProblem(), model=model, optimizer=optimizer)

(Collaborator)

Can we pass the standard torch optimizer and do the wrapping inside the solvers? To me, this is more intuitive for the user.
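
A minimal sketch of what this suggestion could look like, assuming a hypothetical wrap_optimizer helper inside the solver (not PINA's actual API):

import torch

from pina.optim import TorchOptimizer

def wrap_optimizer(optimizer, **kwargs):
    # Hypothetical helper: accept either an already-built TorchOptimizer or
    # a plain torch.optim optimizer class (e.g. torch.optim.Adam) and wrap
    # the latter internally, so a plain torch optimizer can be passed
    # directly to the solver.
    if isinstance(optimizer, TorchOptimizer):
        return optimizer
    return TorchOptimizer(optimizer, **kwargs)

# With such a helper, both of these would behave the same for the user:
wrapped_a = wrap_optimizer(torch.optim.Adam, lr=0.001)
wrapped_b = wrap_optimizer(TorchOptimizer(torch.optim.Adam, lr=0.001))

This would move the wrapping (and any optimizer kwargs) inside the solver, which is the trade-off being discussed in this thread.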

@dario-coscia (Collaborator) Nov 15, 2024

@ndem0 what do you think?

@FilippoOlivo marked this pull request as draft November 14, 2024 13:51
@FilippoOlivo marked this pull request as ready for review November 19, 2024 13:33