add run! #5

Open · wants to merge 1 commit into base: master
50 changes: 47 additions & 3 deletions src/ApproxInferenceBase.jl
@@ -1,5 +1,49 @@
module ApproxInferenceBase
using Distributions
using Random
export run!
include("priors.jl")

"""
    run!(method, model, data;
         verbosity = 0, callback = () -> nothing, rng = Random.GLOBAL_RNG)

Run approximate inference `method` on `model` and `data`.
The `model` should be a callable object (a function or a functor) that takes a
single argument and returns something that can be compared to the `data`; the
comparison metric is defined by the `method`.
Handling of constants and extraction of summary statistics should be done inside
the `model` (see the examples below).
Verbosity levels are `verbosity = 0` (silent), `verbosity = 1` (progress) and
`verbosity = 2` (detailed).
The `callback` is a callable object with no arguments that is invoked after
every iteration of an iterative `method`. A custom random number generator can
be passed via the `rng` keyword argument.

# Model examples
```
# simple model
model(params) = sum(params)

# complex model with constants
complex_model(params, constants) = sum(params) + sum(constants)
model(params) = let constants = [1, 2, 3]
complex_model(params, constants)
end

# extracting summary statistics (`mean` is from the Statistics standard library)
using Statistics: mean
raw_model(params) = rand(4, 3)
model(params) = mean(raw_model(params), dims = 2)

# functor
struct Model
options
end
(m::Model)(params) = sum(params) + sum(m.options)
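
# usage sketch (hypothetical: `SomeMethod` stands for a concrete inference
# method type defined by a downstream package, not by ApproxInferenceBase)
# run!(SomeMethod(), Model([1, 2, 3]), data;
#      verbosity = 1,
#      callback = () -> println("iteration finished"),
#      rng = Random.MersenneTwister(0))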
```
"""
function run!(method, model, data;
              verbosity = 0, callback = () -> nothing, rng = Random.GLOBAL_RNG)
    throw(MethodError(run!, (method, model, data)))
end
end # module

Contributor (review comment on `run!`):

I'm starting to believe that maybe it is better not to talk about data at all. Perhaps it is best to ask users to directly return a cost (or a vector of costs, if that makes sense for some algorithms); this way the data could, in principle, change during inference in a user-defined way. What do you think?
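
A minimal sketch of what that cost-based interface could look like. The names `RejectionSketch` and `run_cost!`, and the rejection logic itself, are hypothetical illustrations and are not part of this PR or of ApproxInferenceBase:

```julia
using Random

# Hypothetical method type: a concrete algorithm would normally live in a
# downstream package, not in ApproxInferenceBase itself.
struct RejectionSketch
    threshold::Float64
    nsamples::Int
end

# In the cost-based variant the user passes a callable that maps parameters
# directly to a scalar cost; the data and the comparison metric live inside
# that callable and may change between calls.
function run_cost!(method::RejectionSketch, cost;
                   verbosity = 0, callback = () -> nothing,
                   rng = Random.GLOBAL_RNG)
    accepted = Vector{Vector{Float64}}()
    for _ in 1:method.nsamples
        params = rand(rng, 3)                     # stand-in for a prior draw
        cost(params) <= method.threshold && push!(accepted, params)
        callback()
    end
    verbosity > 0 && println("accepted $(length(accepted)) of $(method.nsamples) draws")
    return accepted
end

# usage sketch
# cost(params) = abs(sum(params) - 1.0)
# run_cost!(RejectionSketch(0.1, 1_000), cost; verbosity = 1)
```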