[Bug]: Batch Relaxer is missing gradient for converged check #127

@luisbro

Description

Contact Details

No response

Bug Description

Running the MatterGen evaluation script calls MatterSim's BatchRelaxer, which fails with the error TypeError: Optimizer.converged() missing 1 required positional argument: 'gradient'. The root cause is ase, which changed its API and now requires the gradient argument. An issue has been raised on ase's GitLab, where it will be discussed whether this change should be reverted.

MatterSim Version

v1.1.2

Python Version

3.10.14

Reproduction Steps

  1. Install MatterSim as one of MatterGen's dependencies
    • This installs an older version v1.1.2, but the issue likely affects newer ones as well
  2. Run MatterGen's evaluation script

Expected Behavior

MatterSim relaxes the given structures

Actual Behavior

The script terminates early with the error TypeError: Optimizer.converged() missing 1 required positional argument: 'gradient', due to the missing argument here:

if opt.converged() or opt.nsteps >= self.max_n_steps:
Workaround / fix

gradient = opt.optimizable.get_gradient()
if opt.converged(gradient) or opt.nsteps >= self.max_n_steps:
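Since older ase releases do not accept a gradient argument, a version-agnostic guard may be preferable to the two-line fix above. The following sketch dispatches on the method signature; converged_compat and the duck-typed dispatch are my own illustration, not MatterSim or ase code:

```python
import inspect

def converged_compat(opt):
    """Check optimizer convergence across ase API versions.

    Newer ase releases require Optimizer.converged() to receive a
    `gradient` argument; older releases take no argument. This helper
    (an illustrative sketch, not part of MatterSim or ase) dispatches
    based on the bound method's signature.
    """
    if "gradient" in inspect.signature(opt.converged).parameters:
        # New API: fetch the gradient from the optimizable wrapper.
        return opt.converged(opt.optimizable.get_gradient())
    # Old API: no argument needed.
    return opt.converged()
```

With this helper, the check in BatchRelaxer would become `if converged_compat(opt) or opt.nsteps >= self.max_n_steps:` and work against both old and new ase versions.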

Error Logs

Code of Conduct

  • I agree to follow the project's Code of Conduct

Labels

    bug (Something isn't working)
