Linear Algebra – A Geometric Perspective
A crash course starting from the geometric viewpoint.
A two-part crash course covering numerical methods for ODEs (relevant to gradient flow and momentum) and numerical linear algebra (conditioning, iterative solvers, and preconditioning), both essential for modern optimization.
An introduction to the core concepts of functional analysis essential for understanding optimization theory in machine learning.
A concise review of the multivariable calculus concepts essential for understanding mathematical optimization, including partial derivatives, gradients, Hessians, and Taylor series.
A crash course on convex sets and functions, subdifferential calculus, duality, and optimization algorithms, forming a crucial foundation for understanding optimization in machine learning.
An intuition-first introduction to manifolds, metrics, curvature, and their applications in understanding machine learning optimization.
A crash course introducing the geometric structure of statistical models, the Fisher Information Metric, dual connections, and the natural gradient.
A foundational journey into sequential decision-making, regret minimization, and adaptive optimization algorithms.
A crash course on the calculus of functionals, exploring how to optimize over entire functions rather than points. Covers the Euler-Lagrange equation, connections to physics (Lagrangian/Hamiltonian), the Legendre transform, and applications to classic optimization...
A crash course on tensor calculus, focusing on definitions, notation, and operations essential for understanding advanced machine learning and optimization techniques in high-dimensional spaces.
A crash course on essential concepts from statistics and information theory, crucial for understanding modern machine learning and optimization.