October 3, 2017

Episode 23: Why do ensemble methods work?

Ensemble methods are designed to improve on the performance of a single model when that model is not very accurate. In its general definition, ensembling consists of building a number of individual classifiers and then combining o...
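The combination step can be sketched with a majority vote over a few weak classifiers. This is a minimal illustration, not code from the episode; the three threshold rules below are hypothetical stand-ins for trained models.

```python
from collections import Counter

# Three weak "classifiers" (hypothetical threshold rules) over a 2-feature sample.
# Each returns a class label; the ensemble takes the majority vote.
def clf_a(x): return 1 if x[0] > 0.5 else 0
def clf_b(x): return 1 if x[1] > 0.3 else 0
def clf_c(x): return 1 if x[0] + x[1] > 1.0 else 0

def majority_vote(classifiers, x):
    votes = [clf(x) for clf in classifiers]
    return Counter(votes).most_common(1)[0][0]   # most frequent label wins

sample = (0.7, 0.2)   # clf_a votes 1, clf_b votes 0, clf_c votes 0
print(majority_vote([clf_a, clf_b, clf_c], sample))  # -> 0
```

The intuition: as long as the individual classifiers are better than random and make partly independent mistakes, the vote tends to cancel individual errors out.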
September 25, 2017

Episode 22: Parallelising and distributing Deep Learning

Continuing the discussion of the last two episodes, there is one more aspect of deep learning that I would love to cover and have therefore left for a full episode: parallelising and distributing deep learning on relatively large clusters. As a matt...
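One common scheme for parallelising training is synchronous data parallelism: each worker computes a gradient on its own shard of the data, and the gradients are averaged before a single update. A toy sketch of that idea (sequential here, with a linear model and made-up data standing in for real workers and networks):

```python
# Minimal sketch of synchronous data parallelism (illustrative assumptions):
# each "worker" holds a shard of the data, computes the gradient of a
# squared-error loss for the linear model y = w * x, and the gradients
# are averaged (the all-reduce step) before one shared update.

def gradient(w, shard):
    # derivative of mean squared error over this worker's shard
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

def parallel_step(w, shards, lr=0.1):
    grads = [gradient(w, s) for s in shards]   # in practice: one per worker/GPU
    avg = sum(grads) / len(grads)              # all-reduce: average the gradients
    return w - lr * avg                        # every worker applies the same update

data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (4.0, 8.0)]  # samples of y = 2x
shards = [data[:2], data[2:]]                  # two "workers", half the data each
w = 0.0
for _ in range(100):
    w = parallel_step(w, shards)
print(round(w, 3))  # converges towards 2.0
```

In a real cluster the loop over shards runs concurrently on separate devices and the averaging is done by a communication primitive; the arithmetic of the update is the same.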
September 18, 2017

Episode 21: Additional optimisation strategies for deep learning

In the last episode, How to master optimisation in deep learning, I explained some of the most challenging tasks in deep learning, along with some methodologies and algorithms that improve the speed of convergence of a minimisation method for deep learning. I explor...
August 28, 2017

Episode 20: How to master optimisation in deep learning

The secret behind deep learning is not really a secret. It is function optimisation. What a neural network essentially does is optimise a function. In this episode I illustrate a number of optimisation methods and explain which one is the best and why...
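Two of the methods one typically compares are plain gradient descent and gradient descent with momentum. A minimal sketch on a one-dimensional quadratic (hypothetical learning rates, not values from the episode):

```python
# Minimising f(w) = (w - 3)^2, whose minimum is at w = 3,
# with plain gradient descent and with momentum (heavy-ball style).

def grad(w):
    return 2 * (w - 3)              # derivative of (w - 3)^2

def gd(w, lr=0.1, steps=200):
    for _ in range(steps):
        w -= lr * grad(w)           # step straight down the gradient
    return w

def momentum(w, lr=0.1, beta=0.9, steps=200):
    v = 0.0
    for _ in range(steps):
        v = beta * v + grad(w)      # accumulate a velocity term
        w -= lr * v                 # step along the smoothed direction
    return w

print(gd(0.0), momentum(0.0))       # both approach 3.0
```

On this toy function both converge; the interesting differences (ravines, saddle points, noisy gradients) only show up in higher dimensions, which is what the episode discusses.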
March 28, 2017

Episode 18: Machines that learn like humans

Artificial Intelligence allows machines to learn patterns from data. The way humans learn, however, is different and more efficient. With Lifelong Machine Learning, machines can learn the way human beings do: faster and more efficiently.