
Handling dropouts #11

@timholy


In cases of poor initialization, some components of the mixture may drop out. For example, let's create a 2-component mixture that is very poorly initialized:

julia> X = randn(10);

julia> mix = MixtureModel([Normal(100, 0.001), Normal(200, 0.001)], [0.5, 0.5]);

julia> logpdf.(components(mix), X')
2×10 Matrix{Float64}:
 -4.92479e9   -4.97741e9   -5.02964e9   -5.15501e9   -5.05792e9   …  -5.16391e9   -4.88617e9   -4.93348e9   -5.09162e9
 -1.98493e10  -1.99548e10  -2.00592e10  -2.03088e10  -2.01157e10  …  -2.03265e10  -1.97717e10  -1.98667e10  -2.01828e10
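With log-likelihoods this far apart, the posterior responsibilities collapse completely in double precision: exponentiating the difference underflows to exactly zero for the losing component. A minimal sketch of that effect (plain Julia, no Distributions dependency; the specific values are rounded from the output above):

```julia
# Responsibilities γ[k] ∝ exp(logp[k]); normalize with the log-sum-exp trick.
logp = [-4.9e9, -2.0e10]   # per-point log-likelihoods, rounded from above
m = maximum(logp)
w = exp.(logp .- m)        # exp(-1.51e10) underflows to exactly 0.0
γ = w ./ sum(w)            # γ == [1.0, 0.0]: component 2 gets zero weight
```

So even before any M-step, component 2 receives identically zero responsibility for every data point.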

You can see that both components have very poor likelihood, but one of the two always loses by an enormous margin. Then, when we go to optimize,

julia> fit_mle(mix, X)
ERROR: DomainError with NaN:
Normal: the condition σ >= zero(σ) is not satisfied.
Stacktrace:
  [1] #371
    @ ~/.julia/dev/Distributions/src/univariate/continuous/normal.jl:37 [inlined]
  [2] check_args
    @ ~/.julia/dev/Distributions/src/utils.jl:89 [inlined]
  [3] #Normal#370
    @ ~/.julia/dev/Distributions/src/univariate/continuous/normal.jl:37 [inlined]
  [4] Normal
    @ ~/.julia/dev/Distributions/src/univariate/continuous/normal.jl:36 [inlined]
  [5] fit_mle
    @ ~/.julia/dev/Distributions/src/univariate/continuous/normal.jl:229 [inlined]
  [6] fit_mle(::Type{Normal{Float64}}, x::Vector{Float64}, w::Vector{Float64}; mu::Float64, sigma::Float64)
    @ Distributions ~/.julia/dev/Distributions/src/univariate/continuous/normal.jl:256
  [7] fit_mle
    @ ~/.julia/dev/Distributions/src/univariate/continuous/normal.jl:253 [inlined]
  [8] fit_mle
    @ ~/.julia/dev/ExpectationMaximization/src/that_should_be_in_Distributions.jl:17 [inlined]
  [9] (::ExpectationMaximization.var"#2#3"{Vector{Normal{Float64}}, Vector{Float64}, Matrix{Float64}})(k::Int64)
    @ ExpectationMaximization ./none:0
 [10] iterate(::Base.Generator{Vector{Any}, DualNumbers.var"#1#3"})
    @ Base ./generator.jl:47 [inlined]
 [11] collect_to!(dest::AbstractArray{T}, itr::Any, offs::Any, st::Any) where T
    @ Base ./array.jl:890 [inlined]
 [12] collect_to_with_first!(dest::AbstractArray, v1::Any, itr::Any, st::Any)
    @ Base ./array.jl:868 [inlined]
 [13] collect(itr::Base.Generator{UnitRange{Int64}, ExpectationMaximization.var"#2#3"{Vector{…}, Vector{…}, Matrix{…}}})
    @ Base ./array.jl:842
 [14] fit_mle!(α::Vector{…}, dists::Vector{…}, y::Vector{…}, method::ClassicEM; display::Symbol, maxiter::Int64, atol::Float64, robust::Bool)
    @ ExpectationMaximization ~/.julia/dev/ExpectationMaximization/src/classic_em.jl:48
 [15] fit_mle!
    @ ~/.julia/dev/ExpectationMaximization/src/classic_em.jl:14 [inlined]
 [16] fit_mle(::MixtureModel{…}, ::Vector{…}; method::ClassicEM, display::Symbol, maxiter::Int64, atol::Float64, robust::Bool,
 infos::Bool)
    @ ExpectationMaximization ~/.julia/dev/ExpectationMaximization/src/fit_em.jl:30
 [17] fit_mle(::MixtureModel{Univariate, Continuous, Normal{Float64}, Categorical{Float64, Vector{Float64}}}, ::Vector{Float64})
    @ ExpectationMaximization ~/.julia/dev/ExpectationMaximization/src/fit_em.jl:12
 [18] top-level scope
    @ REPL[8]:1
Some type information was truncated. Use `show(err)` to see complete types.

This arises because `α[:] = mean(γ, dims = 1)` returns `α = [1.0, 0.0]`. In other words, component 2 of the mixture "drops out."
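Concretely, once a component's responsibilities are all zero, the weighted normal MLE in its M-step divides by a zero total weight and produces `NaN`, which is what trips the `σ >= zero(σ)` check above. A minimal reproduction outside of Distributions' internals:

```julia
# Weighted MLE of a Normal when every weight is zero: both moments are 0/0.
w = zeros(10)               # responsibilities of the dropped-out component
x = randn(10)
μ = sum(w .* x) / sum(w)                    # 0/0 = NaN
σ = sqrt(sum(w .* (x .- μ).^2) / sum(w))    # NaN; Normal(μ, σ) then throws
```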

I've found errors like these, as well as positive-definiteness errors in the multivariate case, to be pretty ubiquitous when fitting complicated distributions to point clouds. It seems we'd need some kind of guard against this behavior, but I'm not sure what the state-of-the-art approach is, or I'd implement it myself.
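For what it's worth, one common ad-hoc guard is to floor the mixing weights at a small ε and renormalize, so no component's effective sample size can reach exactly zero; other remedies people use include reinitializing dropped components or putting a Dirichlet prior on the weights (MAP-EM). A sketch of the flooring idea (the `floor_weights` helper is hypothetical, not part of this package):

```julia
# Illustrative guard: clamp mixing weights away from zero, then renormalize.
function floor_weights(α::AbstractVector{<:Real}; ε = 1e-6)
    αf = max.(α, ε)
    return αf ./ sum(αf)
end

floor_weights([1.0, 0.0])   # component 2 keeps a tiny but nonzero weight
```

This doesn't fix the underlying numerical collapse of the E-step, but it keeps the M-step well-defined long enough for the component to recover.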
