tutorials/docs-12-using-turing-guide/using-turing-guide.jmd
@@ -331,66 +331,67 @@ The element type of a vector (or matrix) of random variables should match the `e
### Querying Probabilities from Model or Chain

Turing offers three functions: [`loglikelihood`](https://turinglang.org/DynamicPPL.jl/dev/api/#StatsAPI.loglikelihood), [`logprior`](https://turinglang.org/DynamicPPL.jl/dev/api/#DynamicPPL.logprior), and [`logjoint`](https://turinglang.org/DynamicPPL.jl/dev/api/#DynamicPPL.logjoint) to query the log-likelihood, log-prior, and log-joint probabilities of a model, respectively.

Let's look at a simple model called `gdemo`:
```julia
@model function gdemo0()
    s ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s))
    return x ~ Normal(m, sqrt(s))
end
```
If we observe `x` to be 1.0, we can condition the model on this datum using the [`condition`](https://turinglang.org/DynamicPPL.jl/dev/api/#AbstractPPL.condition) syntax:
```julia
model = gdemo0() | (x=1.0,)
```
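The `|` operator is shorthand for conditioning; a minimal sketch of an equivalent call, assuming the keyword form of `condition` accepts the observed value directly:

```julia
# Assumed equivalent to `gdemo0() | (x=1.0,)`, using `condition` explicitly.
model = condition(gdemo0(); x=1.0)
```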
Now, let's compute the log-likelihood of the observation given specific values of the model parameters, `s` and `m`:
```julia
loglikelihood(model, (s=1.0, m=1.0))
```

We can easily verify that value in this case:
```julia
logpdf(Normal(1.0, 1.0), 1.0)
```
We can also compute the log-prior probability of the model for the same values of `s` and `m`:
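A minimal sketch of that query, assuming `logprior` takes the same `(model, parameters)` arguments as `loglikelihood` above:

```julia
# Log-prior of s = 1.0 and m = 1.0 under InverseGamma(2, 3) and Normal(0, sqrt(s)).
logprior(model, (s=1.0, m=1.0))
```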
Finally, we can compute the log-joint probability of the model parameters and data:
```julia
logjoint(model, (s=1.0, m=1.0))
```
Again, we can verify this value manually:

```julia
logpdf(Normal(1.0, 1.0), 1.0) +
logpdf(InverseGamma(2, 3), 1.0) +
logpdf(Normal(0, sqrt(1.0)), 1.0)
```
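Since the log-joint is the sum of the log-prior and the log-likelihood, another way to check the value above is a sketch using the two functions already introduced (assuming `logprior` accepts the same named-tuple form):

```julia
# log-joint = log-prior + log-likelihood
logprior(model, (s=1.0, m=1.0)) + loglikelihood(model, (s=1.0, m=1.0))
```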
Querying with a `Chains` object is easy as well:
```julia
chn = sample(model, Prior(), 10)
```
```julia
loglikelihood(model, chn)
```
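The other two query functions follow the same pattern; a minimal sketch, assuming `logprior` and `logjoint` also accept a `Chains` object and evaluate element-wise over its samples:

```julia
# Assumed usage: per-sample log-prior and log-joint for the chain.
logprior(model, chn)
logjoint(model, chn)
```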
### Maximum likelihood and maximum a posteriori estimates