I am aware that the default inference implemented is based on mean squared error (MSE) loss. Is there an implemented example, or a way, to obtain aleatoric uncertainty instead (either homoscedastic or heteroscedastic), i.e. learning to output the variance of an isotropic Gaussian distribution?
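
To make the question concrete, here is a minimal sketch of the kind of thing I mean, assuming a PyTorch-style setup (the two-headed module, feature sizes, and variable names are purely illustrative, not part of this library's API):

```python
import torch
import torch.nn as nn

# Illustrative two-headed regressor: one head predicts the mean,
# the other a positive variance for an isotropic Gaussian.
class HeteroscedasticHead(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()
        self.mean = nn.Linear(in_features, out_features)
        self.log_var = nn.Linear(in_features, out_features)

    def forward(self, h):
        mu = self.mean(h)
        var = torch.exp(self.log_var(h))  # exponentiate to keep variance positive
        return mu, var

# Gaussian negative log-likelihood in place of the MSE criterion:
# roughly 0.5 * (log(var) + (y - mu)^2 / var), up to constants.
criterion = nn.GaussianNLLLoss()

# Toy usage with made-up shapes.
h = torch.randn(8, 16)            # features from some backbone
y = torch.randn(8, 1)             # regression targets
head = HeteroscedasticHead(16, 1)
mu, var = head(h)
loss = criterion(mu, y, var)
loss.backward()
```

The homoscedastic case would be the same idea with a single learned log-variance parameter shared across inputs instead of a per-input variance head.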