
Loss patterns in toy data for LoRA training #208

@LearningLeopard

Description


Hey OmniGen team,
Thank you for such easy-to-use fine-tuning scripts; it was very easy to get started. I was playing around with your toy data, and when I train the model on it, the loss fluctuates in a very unstable way. From general experience I know this can mean the learning rate is too high, but even after reducing it, the loss history has not stabilized. I'm new to training T2I models, especially with LoRA, so is this kind of fluctuation normal? I'm going to train on some of my own data next, and if I see similar loss fluctuation patterns there, is that okay, or should I tune the hyperparameters until the curve is more stable?
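
To make "fluctuation" concrete: I'm looking at the raw per-step loss values. Would it be fairer to judge the trend from a smoothed curve instead? Something like the sketch below is what I have in mind (a plain exponential moving average over the logged losses; the `ema` helper and the `losses` list are just my own illustration, nothing from the OmniGen scripts):

```python
# Minimal sketch (not part of the OmniGen scripts): smooth a logged
# per-step loss history with an exponential moving average so an
# underlying trend is visible behind the per-batch noise.
# `losses` is assumed to be a plain list of floats collected during
# training; beta=0.98 is an arbitrary smoothing factor.

def ema(values, beta=0.98):
    """Bias-corrected exponential moving average of a sequence."""
    smoothed, avg = [], 0.0
    for step, value in enumerate(values, start=1):
        avg = beta * avg + (1.0 - beta) * value
        smoothed.append(avg / (1.0 - beta ** step))  # correct the startup bias
    return smoothed


if __name__ == "__main__":
    import random

    # Fake noisy-but-decreasing losses, just to exercise the helper.
    losses = [1.0 / (1 + 0.01 * step) + random.uniform(-0.2, 0.2) for step in range(500)]
    smoothed = ema(losses)
    print("raw last 5:     ", [round(x, 3) for x in losses[-5:]])
    print("smoothed last 5:", [round(x, 3) for x in smoothed[-5:]])
```

(I picked the bias-corrected form so the first few smoothed points aren't dragged toward zero.)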

Thank you for your inputs!
