
Cryptic error messages #27

Closed
ThijsvdLaar opened this issue Feb 22, 2019 · 3 comments · Fixed by #204
ThijsvdLaar (Collaborator)

To improve the usability of ForneyLab, it is important to return informative error messages. Often, tracing an error back to its true origin requires so much in-depth knowledge of ForneyLab internals that the message becomes impossible to decipher. Let's collect such errors here, together with a short description of how they arose. Separate issues or pull requests can be opened for improvement proposals.

@ThijsvdLaar ThijsvdLaar self-assigned this Apr 9, 2019
ivan-bocharov (Collaborator)

When a factor graph contains a dangling edge, constructing a recognition factorization fails with the following error message:

LoadError: type Nothing has no field node
Stacktrace:
 [1] getproperty(::Any, ::Symbol) at ./sysimg.jl:18
 [2] nodes(::Set{Edge}) at ~/.julia/dev/ForneyLab/src/factor_graph.jl:108
 [3] (::getfield(ForneyLab, Symbol("##RecognitionFactor#137#139")))(::RecognitionFactorization, ::Symbol, ::Type, ::Set{Variable}) at ~/.julia/dev/ForneyLab/src/algorithms/variational_bayes/recognition_factorization.jl:36

...

An example to reproduce the error:

using ForneyLab

g = FactorGraph()

@RV a ~ GaussianMeanVariance(0.0, 1.0)
@RV b ~ GaussianMeanVariance(a, 1.0)
@RV c ~ GaussianMeanVariance(b, 1.0)
@RV d ~ GaussianMeanVariance(c, 1.0)

q = RecognitionFactorization(a, b, c)
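For reference, a minimal sketch of a workaround, assuming the dangling edge is the unterminated variable `d` (it is neither observed nor included in the factorization). Terminating the graph before constructing the factorization avoids the cryptic `Nothing has no field node` failure:

using ForneyLab

g = FactorGraph()

@RV a ~ GaussianMeanVariance(0.0, 1.0)
@RV b ~ GaussianMeanVariance(a, 1.0)
@RV c ~ GaussianMeanVariance(b, 1.0)
@RV d ~ GaussianMeanVariance(c, 1.0)

# Terminate the graph by observing d through a placeholder,
# so the edge for d no longer dangles.
placeholder(d, :d)

q = RecognitionFactorization(a, b, c)

On the library side, a friendlier message could be produced by checking for a partner-less edge in `nodes(::Set{Edge})` (the frame in the stacktrace above) before dereferencing `.node`, and raising an error that names the dangling variable.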

ismailsenoz (Contributor) commented Apr 26, 2019

When a marginal rule is not specified while constructing a structured algorithm, I get the following error message (shown here truncated):

, Edges:
Edge belonging to variable x_t: ( gpc_1.i[out] )----( gaussianmeanvariance_3.i[m] ).
Edge belonging to variable x_t_min: ( gaussianmeanvariance_2.i[out] )----( gpc_1.i[m] ).
) with inbound types Type[Message{GaussianMeanVariance,var_type} where var_type<:VariateType, Message{GaussianMeanVariance,var_type} where var_type<:VariateType, ProbabilityDistribution]
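For context, structured algorithms look up a registered marginal rule for each node; when none matches the inbound types, the constructor fails with the raw signature dump above. A hedged sketch of how a rule registration looks, following ForneyLab's rule-macro convention (`MyNode` and `MMyNodeGGD` are hypothetical placeholder names, not real ForneyLab identifiers):

# Hypothetical sketch: registering a marginal rule for a custom node type,
# so the structured-algorithm constructor can find a joint-marginal computation.
@marginalRule(:node_type     => MyNode,                # hypothetical node type
              :inbound_types => (Message{GaussianMeanVariance},
                                 Message{GaussianMeanVariance},
                                 ProbabilityDistribution),
              :name          => MMyNodeGGD)            # hypothetical rule name

A more helpful error here would state which node type has no matching marginal rule and suggest registering one, instead of printing the full edge listing.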

wmkouw (Member) commented Dec 4, 2019

I'm building a 2D Gaussian mixture model demo. This is the current setup:

Specify generative and recognition models

@RV _pi ~ Dirichlet([1.0, 1.0])
@RV m_1 ~ GaussianMeanVariance([0.0, 0.0], 100*[1. 0.; 0. 1.])
@RV w_1 ~ Wishart([1. 0.; 0. 1.], 2.)
@RV m_2 ~ GaussianMeanVariance([0.0, 0.0], 100*[1. 0.; 0. 1.])
@RV w_2 ~ Wishart([1. 0.; 0. 1.], 2.)

z = Vector{Variable}(undef, N)
y = Vector{Variable}(undef, N)
for i = 1:N
    @RV z[i] ~ Categorical(_pi)
    @RV y[i] ~ GaussianMixture(z[i], m_1, w_1, m_2, w_2)
    
    placeholder(y[i], :y, dims=(2,), index=i)
end

q = RecognitionFactorization(_pi, m_1, w_1, m_2, w_2, z, ids=[:PI, :M1, :W1, :M2, :W2, :Z])
algo = variationalAlgorithm(q)
algo_F = freeEnergyAlgorithm(q);

Execute inference algorithm

data = Dict(:y => X)

marginals = Dict(:_pi => ProbabilityDistribution(Dirichlet, a=[1.0, 1.0]),
                 :m_1 => ProbabilityDistribution(Multivariate, GaussianMeanVariance, m=[-1.0, -1.0], v=1e4*[1. 0.;0. 1.]),
                 :w_1 => ProbabilityDistribution(MatrixVariate, Wishart, v=[1. 0.;0. 1.], nu=2.),
                 :m_2 => ProbabilityDistribution(Multivariate, GaussianMeanVariance, m=[1.0, 1.0], v=1e4*[1. 0.;0. 1.]),
                 :w_2 => ProbabilityDistribution(MatrixVariate, Wishart, v=[1. 0.;0. 1.], nu=2.))
for i = 1:N
    marginals[:z_*i] = ProbabilityDistribution(Categorical, p=[0.5, 0.5])  # :z_*i concatenates to :z_1, :z_2, ...
end

# Execute algorithm
num_iterations = 10
F = Float64[]
for i = 1:num_iterations
    stepZ!(data, marginals)
    stepPI!(data, marginals)
    stepM1!(data, marginals)
    stepW1!(data, marginals)
    stepM2!(data, marginals)
    stepW2!(data, marginals)
    
    push!(F, freeEnergy(data, marginals))
end

The error message I get is:

TypeError: in keyword argument m, expected Array{T,1} where T, got Float64

Stacktrace:
 [1] (::getfield(Core, Symbol("#kw#Type")))(::NamedTuple{(:m,),Tuple{Float64}}, ::Type{ProbabilityDistribution}, ::Type{Multivariate}, ::Type{PointMass}) at ./none:0
 [2] stepZ!(::Dict{Symbol,LinearAlgebra.Adjoint{Float64,Array{Float64,2}}}, ::Dict{Symbol,ProbabilityDistribution}, ::Array{Message,1}) at ./none:6
 [3] stepZ!(::Dict{Symbol,LinearAlgebra.Adjoint{Float64,Array{Float64,2}}}, ::Dict{Symbol,ProbabilityDistribution}) at ./none:5
 [4] top-level scope at ./In[54]:17
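A hedged reading of this stacktrace, independent of the rule-dispatch problem below: `data[:y]` is a matrix (a `Dict{Symbol,LinearAlgebra.Adjoint{Float64,Array{Float64,2}}}` per frame [2]), so linear indexing like `data[:y][9]` in the generated code returns a scalar `Float64` where the multivariate `PointMass` expects a length-2 vector, triggering the `TypeError`. One possible workaround, assuming `X` holds one observation per column, is to pass the data as a vector of vectors:

# Pass observations as a Vector of 2-element Vectors instead of a matrix,
# so that data[:y][i] is the i-th observation vector rather than a single entry.
data = Dict(:y => [X[:, i] for i in 1:N])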

If you open up algo, you'll see:

begin

function stepZ!(data::Dict, marginals::Dict=Dict(), messages::Vector{Message}=Array{Message}(undef, 20))

messages[1] = ruleVBCategoricalOut(nothing, marginals[:_pi])
messages[2] = ruleVBGaussianMixtureZBer(ProbabilityDistribution(Multivariate, PointMass, m=data[:y][9]), nothing, marginals[:m_1], marginals[:w_1], marginals[:m_2], marginals[:w_2])
messages[3] = ruleVBCategoricalOut(nothing, marginals[:_pi])
...

So GaussianMixture still dispatches to the Bernoulli rule for z (it calls ruleVBGaussianMixtureZBer), even though z was specified to be Categorical. This is not obvious from the error message.
