A simple method to estimate uncertainty in Machine Learning
Estimate uncertainty by predicting the quantiles of \(y\) given \(x\), rather than a single point estimate.
\[ \begin{aligned} E &= y - f(x) \\ L_q &= \begin{cases} q E, & E \geq 0 \\ (1 - q) (-E), & E \lt 0 \end{cases} \end{aligned} \]
\[ \begin{aligned} E &= y - f(x) \\ L_q &= \max \left( q E,\; (q - 1) E \right) \end{aligned} \]
import jax.numpy as jnp

def quantile_loss(q, y_true, y_pred):
    # Pinball loss: errors above the prediction are weighted by q,
    # errors below it by (1 - q).
    e = y_true - y_pred
    return jnp.maximum(q * e, (q - 1.0) * e)
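As a quick sanity check (a hypothetical example, not from the original post), the loss is asymmetric around zero error: for \(q = 0.9\), under-predicting costs nine times more than over-predicting by the same amount.

```python
import jax.numpy as jnp

def quantile_loss(q, y_true, y_pred):
    e = y_true - y_pred
    return jnp.maximum(q * e, (q - 1.0) * e)

# q = 0.9: a positive error of 2 costs 0.9 * 2 = 1.8,
# while a negative error of the same size costs only 0.1 * 2 = 0.2.
under = quantile_loss(0.9, 10.0, 8.0)   # error = +2
over = quantile_loss(0.9, 10.0, 12.0)   # error = -2
```

This asymmetry is what pushes the minimizer toward the \(q\)-th quantile instead of the mean.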
Loss landscape for a continuous sequence of y_true values in [10, 20].
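The landscape can be reproduced numerically (a minimal sketch, assuming only `jax`; the grid sizes are arbitrary): sample `y_true` uniformly in [10, 20], evaluate the mean loss over a grid of candidate predictions, and note that the minimizer approximates the \(q\)-th quantile of the data.

```python
import jax.numpy as jnp

def quantile_loss(q, y_true, y_pred):
    e = y_true - y_pred
    return jnp.maximum(q * e, (q - 1.0) * e)

# y_true sampled uniformly in [10, 20]; candidate predictions on a grid.
y_true = jnp.linspace(10.0, 20.0, 100)
y_pred = jnp.linspace(10.0, 20.0, 200)

q = 0.5
# Broadcast to a (n_true, n_pred) grid and average over the data axis.
landscape = quantile_loss(q, y_true[:, None], y_pred[None, :]).mean(axis=0)
best = y_pred[jnp.argmin(landscape)]  # close to 15, the median of [10, 20]
```

Changing `q` shifts the minimizer: `q = 0.9` moves it toward 19, `q = 0.1` toward 11, which is exactly how quantile regression turns one loss into an uncertainty band.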
import jax
import elegy

class QuantileRegression(elegy.Module):
    def __init__(self, n_quantiles: int):
        super().__init__()
        self.n_quantiles = n_quantiles

    def call(self, x):
        # Shared trunk.
        x = elegy.nn.Linear(128)(x)
        x = jax.nn.relu(x)
        x = elegy.nn.Linear(64)(x)
        x = jax.nn.relu(x)
        # One output per predicted quantile.
        x = elegy.nn.Linear(self.n_quantiles)(x)
        return x
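Elegy wires the module, loss, and training loop together for you; purely as an illustration, here is a hypothetical pure-JAX sketch of the same idea. `init_params`, `forward`, and `train_step` are assumptions that mirror the layer sizes above (input → 128 → 64 → n_quantiles), not part of the original post. Each output column is trained against its own quantile via broadcasting.

```python
import jax
import jax.numpy as jnp

def quantile_loss(q, y_true, y_pred):
    e = y_true - y_pred
    return jnp.maximum(q * e, (q - 1.0) * e)

def init_params(key, sizes):
    # sizes mirrors the module above, e.g. [1, 128, 64, n_quantiles].
    params = []
    for n_in, n_out in zip(sizes[:-1], sizes[1:]):
        key, sub = jax.random.split(key)
        w = jax.random.normal(sub, (n_in, n_out)) * jnp.sqrt(2.0 / n_in)
        params.append((w, jnp.zeros(n_out)))
    return params

def forward(params, x):
    for w, b in params[:-1]:
        x = jax.nn.relu(x @ w + b)
    w, b = params[-1]
    return x @ w + b  # shape (batch, n_quantiles)

def loss_fn(params, x, y, quantiles):
    preds = forward(params, x)
    # (batch, 1) vs (n_quantiles,) broadcasts so that column i is
    # trained with quantile_loss at quantiles[i].
    return quantile_loss(quantiles, y[:, None], preds).mean()

@jax.jit
def train_step(params, x, y, quantiles, lr=1e-2):
    loss, grads = jax.value_and_grad(loss_fn)(params, x, y, quantiles)
    params = jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)
    return params, loss
```

After training, `forward(params, x)` returns one column per quantile, so the spread between, say, the 0.1 and 0.9 columns gives a per-input uncertainty interval.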