Commit c22c66d: update images
1 parent: 9b9f50c

5 files changed (+2 additions, -3 deletions)

code/example_introduction/gen_image.py (1 addition, 1 deletion)

@@ -66,6 +66,6 @@
     plt.savefig(outputFolder + "/example_J" + ".png",facecolor=fig.get_facecolor())


-    ax.set_title('Fonctions objectives global and locales $J^\star$ and $J_i^\star$', fontsize=16,usetex=True)
+    ax.set_title('Fonctions objectif global et locales $J^\star$ and $J_i^\star$', fontsize=16,usetex=True)
     plt.savefig(outputFolder + "/example_J_fr" + ".pdf",facecolor=fig.get_facecolor())
     plt.savefig(outputFolder + "/example_J_fr" + ".png",facecolor=fig.get_facecolor())
3 binary image files changed (0 bytes, -5 bytes, +1 byte); contents not shown.

tex/main_matter/safe_dmpc_ineq.tex (1 addition, 2 deletions)

@@ -683,7 +683,6 @@ \section{Adapting detection and mitigation methods}\label{sec:detection-mitigati
 We propose to use a similar strategy to the one used in the last chapter.
 However, some difficulties arise.
 We do not have one tuple $(\Plin,\sik)$ for each subsystem, but we have $2^{\predhorz c}$ different tuples.
-
 Here follow proposals on how to counter the difficulties.

 \subsection{Mitigation and artificial scarcity}\label{sec:mitigation_ineq}

@@ -842,7 +841,7 @@ \subsection{Multiple Parameter Estimation}\label{sec:cons-about-mult}
 In the next subsection we describe the method more precisely.

 \subsubsection{Expectation Maximization}
-The main objective of the \EM{} algorithm is to find, from a set of observable data $\set{B}$, estimators of a set of parameters $\set{P}$ that maximize the log marginal likelihood of the observed data ${\ln\probability{\set{B};\set{P}}}$. The models generally have latent variables (unobservable) in a set $\set{U}$.
+The objective of the \EM{} algorithm is to find, from a set of observable data $\set{B}$, estimators of a set of parameters $\set{P}$ that maximize the log marginal likelihood of the observed data ${\ln\probability{\set{B};\set{P}}}$. The models generally have latent variables (unobservable) in a set $\set{U}$.

 The problem is that maximizing ${\ln\probability{\set{B};\set{P}}}$ does not have an analytical solution.
 So, the algorithm solves the optimization problem in an iterative manner.
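The EM description in the hunk above can be illustrated with a minimal sketch, not taken from the thesis itself: EM for a two-component 1-D Gaussian mixture, where the observations play the role of $\set{B}$, the weights/means/variances play the role of $\set{P}$, and the unobserved component assignments play the role of the latent set $\set{U}$. The function name `em_gmm_1d` and all numeric choices are illustrative assumptions.

```python
import numpy as np

def em_gmm_1d(x, n_iter=50, seed=0):
    """EM for a two-component 1-D Gaussian mixture (illustrative sketch).

    Iterates E-steps (posterior responsibilities of the latent
    assignments) and M-steps (closed-form parameter updates), since
    the log marginal likelihood has no analytical maximizer.
    """
    rng = np.random.default_rng(seed)
    # Initial guesses for the parameters P.
    w = np.array([0.5, 0.5])                                 # mixture weights
    mu = rng.choice(x, size=2, replace=False).astype(float)  # component means
    var = np.array([x.var(), x.var()])                       # component variances

    for _ in range(n_iter):
        # E-step: responsibility of each component for each data point.
        dens = (w / np.sqrt(2 * np.pi * var)
                * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)))
        resp = dens / dens.sum(axis=1, keepdims=True)

        # M-step: maximize the expected complete-data log-likelihood.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk

    # Log marginal likelihood ln p(B; P) at the final parameters.
    dens = (w / np.sqrt(2 * np.pi * var)
            * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)))
    loglik = np.log(dens.sum(axis=1)).sum()
    return w, mu, var, loglik
```

On well-separated synthetic data (e.g. samples from N(-2, 1) and N(3, 1)), the recovered means land close to the true ones after a few dozen iterations; each EM iteration is guaranteed not to decrease the log marginal likelihood, which is why the iterative scheme is used in place of a nonexistent closed-form solution.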
