February 3, 2026

More powerful deep learning with transformers (Ep. 84) (Rebroadcast)

Some of the most powerful NLP models, like BERT and GPT-2, have one thing in common: they all use the transformer architecture. This architecture is built on top of another important concept already known to the community: self-attention. In this episode I ...

AGI: The Dream We Should Never Reach (Ep. 296)

Also on YouTube. Two AI experts who actually love the technology explain why chasing AGI might be the worst thing for AI's future, and why the […]
The Scientists Growing Living Computers in Swiss Labs (Ep. 292)