November 21, 2017

Episode 30: Neural networks and genetic evolution: an unfeasible approach

Despite what researchers claim about genetic evolution, in this episode we give a realistic view of the field.
November 11, 2017

Episode 29: Fail your AI company in 9 steps

In order to succeed with artificial intelligence, it is better to know how to fail first. It is easier than you think. Here are 9 easy steps to fail your AI startup.
November 4, 2017

Episode 28: Towards Artificial General Intelligence: preliminary talk

The enthusiasm for artificial intelligence is raising some concerns, especially with respect to some ventured conclusions about what AI can really do and what its direct descendant, artificial general intelligence, would be capable of doing in the immediat...
October 30, 2017

Episode 27: Techstars accelerator and the culture of fireflies

In the aftermath of my experience at the Barclays Accelerator, powered by Techstars, one of the most innovative and influential startup accelerators in the world, I’d like to give back to the community the lessons learned, including the need for confidence, soft-s...
October 23, 2017

Episode 26: Deep Learning and Alzheimer

In this episode I speak about Deep Learning technology applied to the prediction of Alzheimer's disease. I had a great chat with Saman Sarraf, machine learning engineer at Konica Minolta, former lab manager at the Rotman Research Institute at Baycrest, Universit...
October 16, 2017

Episode 25: How to become data scientist [RB]

In this episode, I speak about the requirements and the skills needed to become a data scientist and join an amazing community that is changing the world with data analytics.
October 9, 2017

Episode 24: How to handle imbalanced datasets

In machine learning, and data science in general, it is very common to deal at some point with imbalanced datasets and class distributions. This is the typical case where the number of observations that belong to one class is significantly lower than those...
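One common remedy for the imbalance described above is to resample the training data. As a minimal sketch (illustrative function names, plain Python, random oversampling being only one option alongside undersampling and class weighting):

```python
import random

def oversample_minority(X, y, seed=0):
    """Randomly duplicate samples of the rarer classes until all
    classes have as many samples as the largest one."""
    rng = random.Random(seed)
    by_class = {}
    for xi, yi in zip(X, y):
        by_class.setdefault(yi, []).append(xi)
    target = max(len(v) for v in by_class.values())
    X_out, y_out = [], []
    for label, samples in by_class.items():
        # Draw extra copies at random from the existing minority samples.
        extra = [rng.choice(samples) for _ in range(target - len(samples))]
        for xi in samples + extra:
            X_out.append(xi)
            y_out.append(label)
    return X_out, y_out

# 9 observations of class 0 and only 1 of class 1:
X = [[i] for i in range(10)]
y = [0] * 9 + [1]
Xb, yb = oversample_minority(X, y)
print(yb.count(0), yb.count(1))  # 9 9
```

Oversampling leaves the majority class untouched at the cost of duplicated minority points; in practice one would compare it against undersampling or cost-sensitive loss functions.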
October 3, 2017

Episode 23: Why do ensemble methods work?

Ensemble methods are designed to improve on the performance of a single model when that model is not very accurate. According to the general definition, ensembling consists of building a number of single classifiers and then combining o...
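The combination step mentioned in the definition above is often a simple majority vote. A minimal sketch (hypothetical predictions standing in for trained classifiers):

```python
from collections import Counter

def majority_vote(predictions):
    """Combine the per-sample predictions of several classifiers
    by taking the most common label for each sample.
    predictions: list of lists, one inner list per classifier."""
    n_samples = len(predictions[0])
    combined = []
    for i in range(n_samples):
        votes = [p[i] for p in predictions]
        combined.append(Counter(votes).most_common(1)[0][0])
    return combined

# Three weak classifiers, each wrong on a different sample:
clf_a = [1, 0, 1, 1]
clf_b = [1, 1, 0, 1]
clf_c = [0, 1, 1, 1]
print(majority_vote([clf_a, clf_b, clf_c]))  # [1, 1, 1, 1]
```

The point of the toy example: each individual classifier misclassifies one sample, yet the vote recovers the correct label everywhere, because the errors are not correlated.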
September 25, 2017

Episode 22: Parallelising and distributing Deep Learning

Continuing the discussion of the last two episodes, there is one more aspect of deep learning that I would love to cover and have therefore saved for a full episode: parallelising and distributing deep learning on relatively large clusters. As a matt...
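One standard scheme for the parallelisation discussed in this episode is synchronous data parallelism: each worker computes a gradient on its own shard of the data, the gradients are averaged, and the shared parameters take a single step. A minimal single-process sketch on a toy one-dimensional problem (all function names are illustrative; in a real cluster the shard gradients would be computed concurrently and combined with an all-reduce):

```python
def sgd_data_parallel(w, data_shards, grad_fn, lr=0.1, steps=100):
    """Synchronous data-parallel gradient descent on a scalar parameter:
    average the per-shard gradients, then apply one update."""
    for _ in range(steps):
        # In practice each grad_fn call runs on a separate worker.
        grads = [grad_fn(w, shard) for shard in data_shards]
        avg_grad = sum(grads) / len(grads)
        w -= lr * avg_grad
    return w

def grad_fn(w, shard):
    """Gradient of the mean squared error between w and the shard targets."""
    return sum(2 * (w - t) for t in shard) / len(shard)

# Two shards whose targets average to 3.0; w should converge there.
shards = [[2.0, 3.0], [4.0, 3.0]]
w = sgd_data_parallel(0.0, shards, grad_fn)
print(round(w, 3))  # 3.0
```

Averaging shard gradients here gives exactly the full-batch gradient, which is why the synchronous scheme converges to the same solution as training on one machine; the trade-off in a real cluster is the communication cost of that averaging step.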