[Bug]: Dropout Probability (under training tab) is likely flawed for newer models #957

@dxqb

Description

What happened?

The dropout probability setting on the training tab (not the one under text augmentation):

(screenshot of the Dropout Probability setting omitted)

It is likely not functioning as intended on many newer models, for the same reasons described in #950.

A 0-vector embedding is not the same as a dropped-out caption.

Documenting this now because I am going to merge the Chroma PR with this bug still present there, since it affects many or all models, not only Chroma:

# apply dropout FIXME not a real dropout

What did you expect would happen?

Dropout was probably intended as training the unconditional. This might depend on the model, but in many models the unconditional is the text encoder output for an empty prompt, not a 0-vector embedding.
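To illustrate the distinction, here is a minimal sketch (hypothetical code, not OneTrainer's actual API): a text encoder produces a non-zero embedding even for an empty prompt, because special tokens such as BOS/EOS still contribute. Zeroing the embedding therefore feeds the model something it never sees at inference time, whereas encoding the empty prompt trains the real unconditional. The `encode` function below is a toy stand-in for a text encoder.

```python
import random

def encode(prompt: str) -> list[float]:
    # Toy stand-in for a text encoder: deterministic per prompt, and
    # even an empty prompt yields a non-zero embedding (in a real model,
    # BOS/EOS tokens still contribute to the output).
    gen = random.Random(prompt)
    return [gen.uniform(-1.0, 1.0) for _ in range(4)]

def caption_dropout_zeroing(prompt: str, p: float, rng: random.Random) -> list[float]:
    # The flawed variant: "drops" the caption by zeroing the embedding.
    emb = encode(prompt)
    if rng.random() < p:
        return [0.0] * len(emb)
    return emb

def caption_dropout_empty_prompt(prompt: str, p: float, rng: random.Random) -> list[float]:
    # The intended variant: drop the caption by encoding an empty
    # prompt, i.e. actually train the unconditional.
    if rng.random() < p:
        return encode("")
    return encode(prompt)

# The empty-prompt embedding is not the zero vector, so the two
# dropout variants train on different inputs.
empty = encode("")
assert empty != [0.0] * len(empty)
```

With `p = 1.0` both variants always "drop" the caption, which makes the difference easy to see: the zeroing variant returns `[0.0, 0.0, 0.0, 0.0]`, while the empty-prompt variant returns `encode("")`.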

Relevant log output

Generate and upload debug_report.log

No response

Metadata

Assignees: no one assigned
Labels: bug (Something isn't working)
