What happened?
The dropout probability on the training tab (not the one under text augmentation) is likely not functioning as intended on many newer models, for the same reasons as described here: #950
A 0-vector embedding is not the same as a dropped out caption.
Documenting this now, because I am going to merge the Chroma PR with this bug still present there, since it affects many or all models, not only Chroma:

OneTrainer/modules/model/ChromaModel.py, line 210 in 9f4f666: `# apply dropout FIXME not a real dropout`
What did you expect would happen?
Dropout was probably intended as training the unconditional. This may depend on the model, but for many models the unconditional is the text-encoder output of an empty prompt, not a 0-vector embedding.
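The difference can be sketched with a toy encoder. This is a minimal illustration, not OneTrainer's actual API; all function names here are hypothetical:

```python
import random


def toy_text_encoder(prompt: str, dim: int = 4) -> list[float]:
    # Stand-in for a real text encoder: deterministic, and crucially
    # the empty prompt "" still maps to a NON-zero embedding, just as
    # a real text encoder with learned weights would.
    return [0.1 * (i + 1) + 0.01 * len(prompt) for i in range(dim)]


def broken_caption_dropout(prompt: str, p: float, rng: random.Random) -> list[float]:
    # What the flagged code effectively does: with probability p,
    # replace the caption embedding with an all-zero vector.
    if rng.random() < p:
        return [0.0] * 4
    return toy_text_encoder(prompt)


def intended_caption_dropout(prompt: str, p: float, rng: random.Random) -> list[float]:
    # Likely intent: with probability p, train the unconditional by
    # encoding the empty prompt, which is generally not a zero vector.
    if rng.random() < p:
        return toy_text_encoder("")
    return toy_text_encoder(prompt)


# The empty-prompt embedding differs from the zero vector,
# so the two dropout variants train different unconditionals.
print(toy_text_encoder("") == [0.0] * 4)  # False
```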
Relevant log output
Generate and upload debug_report.log
No response