
Bug: inverse_sigmoid(1.0) produces Inf in densify_from_depth_propagation #87

@anton-brandl

Description


In scene/gaussian_model.py line 486, opacity is initialized as:

    opacities = inverse_sigmoid(1.0 * torch.ones(...))

Since inverse_sigmoid(x) = log(x / (1 - x)), the denominator is zero at x = 1.0, so this evaluates to Inf. The Inf opacities then propagate into the optimizer and trigger CUDA errors during training.

Fix: use 0.999 instead of 1.0 to get near-full opacity without hitting the singularity:

    opacities = inverse_sigmoid(0.999 * torch.ones(...))
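A minimal pure-Python sketch of the failure mode and the fix (the repo's inverse_sigmoid operates on torch tensors, where the 1.0 case yields Inf rather than an exception; the function and variable names below are illustrative, not from the repo):

```python
import math

def inverse_sigmoid(x: float) -> float:
    """Logit function: log(x / (1 - x)). Singular at x = 1.0."""
    return math.log(x / (1.0 - x))

# At x = 1.0 the denominator is zero: torch.log of the tensor result
# is Inf, while plain-Python float division raises ZeroDivisionError.
try:
    inverse_sigmoid(1.0)
    hit_singularity = False
except ZeroDivisionError:
    hit_singularity = True

# 0.999 keeps the pre-activation finite while still sigmoiding back
# to near-full opacity.
v = inverse_sigmoid(0.999)               # ~6.9068, i.e. log(999)
roundtrip = 1.0 / (1.0 + math.exp(-v))   # recovers ~0.999
```

The round-trip check shows why 0.999 is a safe stand-in: after the sigmoid activation applied during rendering, the opacity is still 0.999, visually indistinguishable from 1.0, but the stored logit stays finite.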
