@@ -137,6 +137,7 @@ Create a loss function for given
 - g(x, ϕ): machine learning model
 - transM: transformation of parameters at unconstrained space
 - f(θMs, θP): mechanistic model
+- py: `function(y_pred, y_obs, y_unc)` to compute negative log-likelihood, i.e. cost
 - intϕ: interpreter attaching axis with components ϕg and ϕP
 - intP: interpreter attaching axis to ζP = ϕP with components used by f.
   The default uses `intϕ(ϕ)` as a template
@@ -160,7 +161,7 @@ and returns a NamedTuple of
 - `neg_log_prior`: negative log-prior of `θMs` and `θP`
 - `neg_log_prior`: negative log-prior of `θMs` and `θP`
 """
-function get_loss_gf(g, transM, transP, f,
+function get_loss_gf(g, transM, transP, f, py,
         intϕ::AbstractComponentArrayInterpreter,
         intP::AbstractComponentArrayInterpreter = ComponentArrayInterpreter(
             intϕ(1:length(intϕ)).ϕP);
@@ -178,7 +179,6 @@ function get_loss_gf(g, transM, transP, f,
     # , intP = get_concrete(intP)
     # inv_transP = inverse(transP), kwargs = kwargs
     function loss_gf(ϕ, xM, xP, y_o, y_unc, i_sites)
-        σ = exp.(y_unc ./ 2)
         ϕc = intϕ(ϕ)
         # μ_ζP = ϕc.ϕP
         # xMP = _append_each_covars(xM, CA.getdata(μ_ζP), pbm_covar_indices)
@@ -190,7 +190,9 @@ function get_loss_gf(g, transM, transP, f,
         y_pred, θMs_pred, θP_pred = gf(
             g, transMs, transP, f, xM, xP, CA.getdata(ϕc.ϕg), CA.getdata(ϕc.ϕP),
             pbm_covar_indices; cdev, kwargs...)
-        nLy = sum(abs2, (y_pred .- y_o) ./ σ)
+        # σ = exp.(y_unc ./ 2)
+        # nLy = sum(abs2, (y_pred .- y_o) ./ σ)
+        nLy = py(y_pred, y_o, y_unc)
         # logpdf is not typestable for Distribution{Univariate, Continuous}
         logpdf_t = (prior, θ) -> logpdf(prior, θ)::eltype(θP_pred)
         logpdf_tv = (prior, θ::AbstractVector) -> begin
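
The commit replaces the hardcoded Gaussian cost with the pluggable `py` callback. A minimal sketch of a compatible callback, reproducing the removed inline computation (the name `neg_log_lik_gauss` is hypothetical, and `y_unc` is assumed to hold log-variances, as implied by the removed `σ = exp.(y_unc ./ 2)`):

```julia
# Hypothetical callback matching the `py(y_pred, y_obs, y_unc)` signature.
# Interprets y_unc as log(σ²), so σ = exp.(y_unc ./ 2), and returns the
# sum of squared standardized residuals, as the removed inline code did.
function neg_log_lik_gauss(y_pred, y_obs, y_unc)
    σ = exp.(y_unc ./ 2)
    return sum(abs2, (y_pred .- y_obs) ./ σ)
end

# Passed as the new positional argument of get_loss_gf:
# loss_gf = get_loss_gf(g, transM, transP, f, neg_log_lik_gauss, intϕ)
```

This keeps the previous behavior available as the default choice while allowing other likelihoods (e.g. robust or heteroscedastic costs) to be plugged in without changing `loss_gf` itself.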