It all starts from physics. 4 best use cases of entropy
The entropy of an isolated system never decreases.
Everyone, at some point in their lives, learned this in a physics class at school. It is the second law of thermodynamics, the only law of physics that requires a particular direction for time, also called the arrow of time. What does this have to do with machine learning?
In this episode I provide you with the 4 best use cases of entropy.
An example from physics
Entropy is an abstract concept, yet everyone has an intuitive sense of its effects and of the arrow of time. For instance, imagine watching a video of a glass that falls from a table and shatters into many pieces.
Played in reverse, the video would show the pieces reassembling themselves into a glass and jumping from the floor back onto the table. We intuitively recognize that the video makes sense only when played forward, matching our expectations. In other words, it makes sense in the direction in which the entropy of the scene increases.
Now imagine filling a tank with two different gases, say gas A and gas B. Because of the second law of thermodynamics, the two gases reach an equilibrium in which they are so thoroughly mixed that it is difficult, if not impossible, to tell which molecule belongs to which gas.
In contrast, in a low-entropy tank the particles of gas A would be gathered in a specific region, perfectly distinguishable from the particles of gas B.
To be more specific, entropy is a measure of the disorder of a physical system. Hence, it is related to the uncertainty associated with its physical state.
Entropy in machine learning
A similar notion of uncertainty and lack of knowledge applies to machine learning. In fact, entropy can also be interpreted as the expected amount of information carried by a random variable: the more uncertain the outcome, the more information we gain by observing it.
Such a concept plays a crucial role in machine learning, where researchers refer to it as Shannon entropy.
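As a rough illustration (not taken verbatim from the episode), the Shannon entropy of a discrete distribution p is H(p) = -∑ pᵢ log₂ pᵢ, measured in bits when the logarithm has base 2. The minimal sketch below computes it with NumPy for a few example distributions.

```python
import numpy as np

def shannon_entropy(p, base=2):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                          # by convention, 0 * log(0) = 0
    return -np.sum(p * np.log(p)) / np.log(base)

print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: a fair coin, maximal uncertainty
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits: a biased coin is more predictable
print(shannon_entropy([1.0, 0.0]))   # 0.0 bits: no uncertainty at all
```

A uniform distribution maximizes the entropy, while a distribution concentrated on a single outcome has zero entropy, mirroring the mixed and unmixed gas tanks above.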
In this episode we provide you with the 4 best use cases of entropy in machine learning:
- exploratory data analysis
- feature selection (see the sketch after this list)
- subset extraction
- modeling
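To give a flavour of one of these use cases, the sketch below is a hypothetical example of entropy-driven feature selection: it uses scikit-learn's mutual_info_classif, an entropy-based score, to rank the features of the Iris dataset by how much information they carry about the class label. The dataset and the specific function are assumptions made for illustration, not necessarily the exact workflow discussed in the show.

```python
from sklearn.datasets import load_iris
from sklearn.feature_selection import mutual_info_classif

# Rank the Iris features by mutual information (an entropy-based score)
# with the class label: a higher score means a more informative feature.
data = load_iris()
scores = mutual_info_classif(data.data, data.target, random_state=0)

for name, score in sorted(zip(data.feature_names, scores), key=lambda t: -t[1]):
    print(f"{name:25s} {score:.3f}")
```

Features whose score is close to zero tell us almost nothing about the target and are natural candidates for removal.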
To find out more, listen to the show.
References
Entropy in machine learning https://amethix.com/entropy-in-machine-learning/