\begin{frame} \frametitle{Error minimization by gradient descent}
\begin{itemize}
\item minimizing the MSE yields a quadratic cost function; in that case (and only in that case) the location of the minimum is available in closed form as $\mathbf{h} = \mathbf{R}^{-1}\mathbf{g}$ (a sketch of the derivation follows on the next slide)
\item most machine learning problems involve more complicated loss functions
\item how can we try to find a minimum in these cases?
\end{itemize}
\centering
\vspace{2em}
iteratively, via gradient descent
\end{frame}
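\begin{frame} \frametitle{Aside: where the closed form comes from}
a sketch of the quadratic case, assuming the usual MSE notation with input correlation matrix $\mathbf{R} = E[\mathbf{x}\mathbf{x}^T]$ and cross-correlation vector $\mathbf{g} = E[d\,\mathbf{x}]$ (notation assumed here, not defined on the previous slide):
\begin{align*}
J(\mathbf{h}) &= E\left[(d - \mathbf{h}^T\mathbf{x})^2\right] = E[d^2] - 2\,\mathbf{h}^T\mathbf{g} + \mathbf{h}^T\mathbf{R}\,\mathbf{h}\\
\nabla J(\mathbf{h}) &= -2\,\mathbf{g} + 2\,\mathbf{R}\,\mathbf{h} = \mathbf{0} \quad\Longrightarrow\quad \mathbf{h} = \mathbf{R}^{-1}\mathbf{g}
\end{align*}
setting the gradient of the quadratic cost to zero gives the unique minimum, provided $\mathbf{R}$ is invertible (positive definite)
\end{frame}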
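\begin{frame}[fragile] \frametitle{Gradient descent: a code sketch}
a minimal NumPy sketch of the iterative idea; the example function and the step size $\eta$ are illustrative assumptions, not part of the slides
\begin{verbatim}
import numpy as np

def f(t):          # example cost: f(t) = ||t||^2 (assumed)
    return np.sum(t ** 2)

def grad_f(t):     # its gradient
    return 2.0 * t

eta = 0.1                        # step size (assumed value)
t = np.array([3.0, -2.0, 1.0])   # arbitrary starting point
for k in range(50):
    t = t - eta * grad_f(t)      # step against the gradient
# t is now close to the minimizer (the zero vector)
\end{verbatim}
\end{frame}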
\begin{frame} \frametitle{Gradient descent}
problem setup:
\begin{itemize}
\item assume $f(\mathbf{t})$ is a differentiable multivariate function ($\mathbf{t} = \begin{bmatrix}t_0 & t_1 & \ldots & t_D\end{bmatrix}$)