Foundational knowledge for writing PyMC 5 models including syntax, distributions, sampling, and ArviZ diagnostics. Use when creating or reviewing PyMC models.
Source: choxos/BayesianAgent — plugins/bayesian-modeling/skills/pymc-fundamentals/SKILL.md (January 21, 2026)
Install: `npx add-skill https://github.com/choxos/BayesianAgent/blob/main/plugins/bayesian-modeling/skills/pymc-fundamentals/SKILL.md -a claude-code --skill pymc-fundamentals` (installs to `.claude/skills/pymc-fundamentals/`)
# PyMC 5 Fundamentals
## When to Use This Skill
- Writing new PyMC models in Python
- Understanding PyMC syntax and API
- Converting models from Stan/JAGS to PyMC
- Diagnosing sampling issues with ArviZ
## Model Structure
```python
import pymc as pm
import numpy as np
import arviz as az

y_data = np.random.normal(loc=5, scale=2, size=100)  # example data

with pm.Model() as model:
    # 1. Priors
    mu = pm.Normal("mu", mu=0, sigma=10)
    sigma = pm.HalfNormal("sigma", sigma=1)

    # 2. Likelihood
    y_obs = pm.Normal("y_obs", mu=mu, sigma=sigma, observed=y_data)

    # 3. Sample (PyMC 5 returns an ArviZ InferenceData by default)
    trace = pm.sample(draws=1000, tune=1000)

# 4. Diagnostics
az.summary(trace)
```
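The `az.summary(trace)` table includes `r_hat` and `ess_bulk` columns. The usual rules of thumb are `r_hat < 1.01` and bulk ESS above roughly 400. A minimal sketch of applying those thresholds (the function name and exact cutoffs here are illustrative, not part of ArviZ):

```python
def check_convergence(r_hat, ess_bulk):
    """Flag common sampling problems from one row of an ArviZ summary.

    Thresholds follow the usual rules of thumb: r_hat < 1.01 and
    bulk effective sample size above ~400 across all chains.
    """
    problems = []
    if r_hat >= 1.01:
        problems.append("chains have not mixed (r_hat >= 1.01)")
    if ess_bulk < 400:
        problems.append("bulk ESS too low (< 400)")
    return problems

print(check_convergence(r_hat=1.00, ess_bulk=1200))  # []
print(check_convergence(r_hat=1.05, ess_bulk=150))   # two warnings
```

If either check fires, increase `tune`/`draws` or reparameterize before trusting the posterior.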
## CRITICAL: SD Parameterization
**PyMC uses SD (like Stan), NOT precision (like BUGS):**
```python
# PyMC (SD)
pm.Normal("x", mu=0, sigma=1)  # sigma is the SD
# BUGS equivalent: tau = 1/sigma**2 = 1
# PyMC also accepts precision directly: pm.Normal("x", mu=0, tau=1)
```
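When porting a BUGS model, convert each precision to the SD PyMC expects with `sigma = 1/sqrt(tau)`. A small sketch (the helper name is illustrative):

```python
import math

def tau_to_sigma(tau):
    """Convert a BUGS-style precision (tau) to the SD PyMC expects."""
    return 1.0 / math.sqrt(tau)

# BUGS dnorm(0, 0.25) has precision 0.25, i.e. SD 2:
print(tau_to_sigma(0.25))  # 2.0
```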
## Distribution Quick Reference
### Continuous
```python
pm.Normal("x", mu=0, sigma=1) # Normal
pm.HalfNormal("x", sigma=1) # Half-normal (>0)
pm.HalfCauchy("x", beta=2.5) # Half-Cauchy (>0)
pm.Exponential("x", lam=1) # Exponential
pm.Uniform("x", lower=0, upper=1) # Uniform
pm.Beta("x", alpha=1, beta=1) # Beta
pm.Gamma("x", alpha=2, beta=1) # Gamma
pm.StudentT("x", nu=3, mu=0, sigma=1) # Student-t
pm.LogNormal("x", mu=0, sigma=1) # Log-normal
pm.TruncatedNormal("x", mu=0, sigma=1, lower=0) # Truncated
```
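Parameterizations are easy to misread, so it helps to sanity-check a prior's moments against a known formula before using it. For example, `pm.HalfNormal(sigma=1)` has mean `sigma * sqrt(2/pi)`, and `pm.Gamma(alpha, beta)` uses a *rate* parameter, so its mean is `alpha/beta`:

```python
import math

sigma = 1.0
half_normal_mean = sigma * math.sqrt(2 / math.pi)  # mean of HalfNormal(sigma)
print(round(half_normal_mean, 4))  # 0.7979

alpha, beta = 2.0, 1.0
gamma_mean = alpha / beta  # beta is a rate in PyMC, so mean = alpha / beta
print(gamma_mean)  # 2.0
```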
### Discrete
```python
pm.Bernoulli("x", p=0.5) # Bernoulli
pm.Binomial("x", n=10, p=0.5) # Binomial
pm.Poisson("x", mu=5) # Poisson
pm.NegativeBinomial("x", mu=5, alpha=1) # Negative binomial
pm.Categorical("x", p=[0.3, 0.5, 0.2]) # Categorical
```
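PyMC's `NegativeBinomial(mu, alpha)` uses `alpha` as a dispersion parameter: the variance is `mu + mu**2 / alpha`, so small `alpha` means heavy overdispersion relative to a Poisson (whose variance equals `mu`). A quick arithmetic check:

```python
mu, alpha = 5.0, 1.0
nb_variance = mu + mu**2 / alpha  # overdispersed relative to Poisson (var = mu)
print(nb_variance)  # 30.0
```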
### Multivariate
```python
pm.MvNormal("x", mu=np.zeros(K), cov=np.eye(K))
pm.Dirichlet("x", a=np.ones(K))
chol, corr, stds = pm.LKJCholeskyCov("chol", n=K, eta=2, sd_dist=pm.Exponential.dist(1))  # returns a tuple by default
```
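`LKJCholeskyCov` yields a Cholesky factor of the covariance matrix; the implied covariance is `L @ L.T`, and you typically pass the factor straight to `pm.MvNormal(..., chol=chol)` rather than forming the covariance yourself. A NumPy sketch of the relationship, with an illustrative 2×2 factor:

```python
import numpy as np

L = np.array([[2.0, 0.0],
              [0.6, 1.5]])  # lower-triangular Cholesky factor
cov = L @ L.T               # covariance implied by the factor

# Factoring the covariance recovers L (the round trip is exact here):
print(np.allclose(np.linalg.cholesky(cov), L))  # True
```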
## Sampling
```python
# Standard NUTS
trace = pm.sample(
draws=1000,