Graveyard

I wrote some additional text that, in the end, didn't fit the main story all that well, or that would have made it too fractal to follow. I can't really find the courage to delete it, hence this bonus content. If eight chapters about KL divergence and entropy don't add enough excitement to your life, check it out!

Bonus chapters

I wrote two chapters that show some nice stuff connected to KL divergence.

Multiplicative Weights Update

We explore the Multiplicative Weights Update (MWU) algorithm and its connections to entropy. This is a powerful algorithmic framework that appears in many areas of machine learning, game theory, and optimization.
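
To make the framework concrete, here's a minimal sketch of MWU for aggregating expert advice; the learning rate `eta` and the toy loss sequence are made up for illustration.

```python
import numpy as np

def mwu(losses, eta=0.5):
    """Multiplicative Weights Update over a sequence of expert losses.

    losses: array of shape (T, n); losses[t, i] in [0, 1] is the loss
    of expert i in round t. Returns a distribution over experts.
    """
    n = losses.shape[1]
    w = np.ones(n) / n                # uniform start = maximum-entropy prior
    for loss in losses:
        w = w * np.exp(-eta * loss)   # punish each expert multiplicatively
        w = w / w.sum()               # renormalize to keep a distribution
    return w

# Toy data: expert 0's losses are typically small, expert 1's typically large.
rng = np.random.default_rng(0)
losses = rng.random((100, 2)) * np.array([0.2, 0.8])
print(mwu(losses))  # almost all weight concentrates on expert 0
```

Starting from the uniform distribution and updating multiplicatively is exactly mirror descent with the entropy regularizer, which is where KL divergence enters the analysis.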

Here's a motivating riddle:

How to get rich

Fisher Information

We dive into Fisher information and its fundamental role in statistics and information geometry. We derive it as the limit of KL divergence between nearby distributions.
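
For a taste, here is the identity the chapter derives, stated for a one-parameter family $p_\theta$ (assuming enough smoothness to expand under the expectation):

$$
D_{\mathrm{KL}}\bigl(p_\theta \,\|\, p_{\theta+\varepsilon}\bigr)
= \frac{1}{2}\, I(\theta)\,\varepsilon^2 + O(\varepsilon^3),
\qquad
I(\theta) = \mathbb{E}_{x \sim p_\theta}\!\left[\left(\frac{\partial}{\partial \theta} \log p_\theta(x)\right)^{\!2}\right].
$$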

Here's a motivating riddle:

Why polling sucks

Bonus random content

More applications of entropy and KL divergence, cut from the third chapter.

⚠️ Chain rule & uniqueness
⚠️ Conditional entropy and mutual information
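
In case the titles are cryptic: these chapters revolve around standard identities like the chain rule of entropy and the definition of mutual information,

$$
H(X, Y) = H(X) + H(Y \mid X),
\qquad
I(X; Y) = H(X) - H(X \mid Y)
= D_{\mathrm{KL}}\bigl(p_{X,Y} \,\|\, p_X \otimes p_Y\bigr).
$$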

Bonus Kolmogorov complexity

⚠️ Uncomputability
⚠️ Clustering by relative Kolmogorov complexity
⚠️ What the hell is randomness?
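
The clustering chapter builds on the trick of using a real compressor as a stand-in for (uncomputable) Kolmogorov complexity. Here's a minimal sketch of that idea, the normalized compression distance of Cilibrasi and Vitányi, with `zlib` as my arbitrary choice of compressor and made-up toy strings:

```python
import zlib

def C(data: bytes) -> int:
    """Proxy for Kolmogorov complexity: length of the compressed string."""
    return len(zlib.compress(data, level=9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: small when x and y share structure."""
    cx, cy, cxy = C(x), C(y), C(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Toy example: two similar English sentences vs. an unrelated byte string.
s1 = b"the quick brown fox jumps over the lazy dog " * 5
s2 = b"the quick brown fox leaps over the lazy cat " * 5
s3 = bytes(range(256)) * 2
print(ncd(s1, s2))  # relatively small: the two texts compress well together
print(ncd(s1, s3))  # closer to 1: nothing shared for the compressor to exploit
```

Feeding the pairwise `ncd` matrix into any off-the-shelf clustering method then groups objects by shared structure, no features required.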