• More powerful deep learning with transformers (Ep. 84)

    Some of the most powerful NLP models, like BERT and GPT-2, have one thing in common: they all use the transformer architecture. That architecture is built on top of another important concept already known to the community: self-attention. In this episode I ...
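
    As a rough companion to the episode, here is a minimal sketch of the self-attention idea the transformer builds on: each token's embedding is projected into query, key and value vectors, and the output for a token is a softmax-weighted mix of all value vectors. The function name, matrix shapes and random toy inputs below are illustrative assumptions, not material from the episode.

```python
import numpy as np

def scaled_dot_product_attention(X, W_q, W_k, W_v):
    """Minimal single-head self-attention over a sequence of token embeddings X."""
    Q = X @ W_q                                   # queries
    K = X @ W_k                                   # keys
    V = X @ W_v                                   # values
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # pairwise token similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V                            # context-aware vector per token

# Toy example: 4 tokens, embedding dimension 8 (hypothetical sizes)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
W_q, W_k, W_v = (rng.normal(size=(8, 8)) for _ in range(3))
out = scaled_dot_product_attention(X, W_q, W_k, W_v)
print(out.shape)  # (4, 8): one attended vector per input token
```

    Real transformers such as BERT and GPT-2 stack many of these attention layers, use multiple heads in parallel, and add learned positional information; the sketch above only shows the core weighting mechanism.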
