Right now, millions of people are simultaneously chatting with a system that remembers nothing, knows nothing, and resets after every message. The engineering keeping that illusion […]
Join the discussion on our Discord server
After reinforcement learning agents proved themselves at playing Atari video games, mastering Go with AlphaGo, trading in financial markets, and modeling language, let me tell you the real story here. In this episode I want to shi...
In this episode I explain how a research group from the University of Lübeck tamed the curse of dimensionality in the generation of large medical images with GANs. The problem is not as trivial as it seems. ...
Training neural networks faster usually means using powerful GPUs. In this episode I explain an interesting method from a group of researchers at Google Brain who can train neural networks faster...
Some of the most powerful NLP models, like BERT and GPT-2, have one thing in common: they all use the transformer architecture. That architecture is built on top of another concept already well known to the community: self-attention. In this episode I ...
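For readers who want to see the idea behind self-attention concretely before listening, here is a minimal NumPy sketch of scaled dot-product self-attention. The random projection matrices and input are purely illustrative assumptions, not code from any of the models discussed in the episode.

```python
import numpy as np

def self_attention(X):
    """Scaled dot-product self-attention over a sequence X of shape (n, d).
    The projection matrices W_q, W_k, W_v are random here, for illustration only."""
    n, d = X.shape
    rng = np.random.default_rng(0)
    W_q, W_k, W_v = (rng.standard_normal((d, d)) for _ in range(3))
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    scores = Q @ K.T / np.sqrt(d)                    # pairwise token similarity, scaled
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                               # each output is a mix of all inputs

X = np.random.default_rng(1).standard_normal((5, 8))  # a toy sequence: 5 tokens, dim 8
out = self_attention(X)
print(out.shape)  # (5, 8): one attended vector per input token
```

The key point the episode builds on: every output position attends to every input position in one matrix multiplication, with no recurrence.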
The brutal truth about why Silicon Valley is blowing billions on glorified autocomplete while pretending it’s the next iPhone. We’re diving deep into the AI investment […]
VortexNet uses actual whirlpools to build neural networks. Seriously. By borrowing equations from fluid dynamics, this new architecture might solve deep learning’s toughest problems—from vanishing gradients […]