Transformer models deliver state-of-the-art performance on a wide range of machine learning tasks, including natural language processing, computer vision, and speech. However, training them at scale often requires a large amount of computing power, making the whole process unnecessarily long, complex, and costly. Join us for a live webinar to learn how the joint Hugging Face and Habana Labs solution makes it easier and quicker to train high-quality transformer models.
Watch the webinar recording to learn about:
- Habana Gaudi training solutions
- Hugging Face transformer models powered by deep learning
- Live demo
- Discussion of the hardware and software accelerating transformer training
- Q&A