Float16 support with loss / gradient scaling #2356

@wwwjn

Description

-- From user feedback

While it might sound counterintuitive, fp16 is back! https://arxiv.org/abs/2510.26788
Supporting fp16 with proper loss / gradient scaling is essential for more stable RL runs. As far as we understand, titan currently has no support for this regime.

Metadata
Assignees

No one assigned

    Labels

enhancement (New feature or request)

    Type

    No type

    Projects

    No projects

    Relationships

    None yet

    Development

    No branches or pull requests