Accelerating NLP with Transformers: AWS & Hugging Face

Are you building Natural Language Processing (NLP) ML workloads and early in your journey with Transformers? Are you looking to explore Transformers but unsure about the learning curve and ROI? Are you a startup technology leader looking to achieve world-leading NLP performance?

If you answered "yes" to any of the questions above, this workshop is for you!

AWS has partnered with Hugging Face to make it easier for companies to leverage state-of-the-art ML models and ship cutting-edge NLP features faster.

Learn how the new Hugging Face Deep Learning Containers make it easier than ever to train Transformer models on Amazon SageMaker. Get started with (literally) a few lines of code, using variants specially optimized for TensorFlow and PyTorch and a variety of infrastructure options.
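
To give a feel for what "a few lines of code" looks like, here is a minimal sketch of launching a training job with the Hugging Face estimator in the SageMaker Python SDK. The script name, container versions, instance type, and hyperparameters are illustrative assumptions, not values from this workshop; substitute ones that match your environment.

```python
# Minimal sketch: training a Transformer on SageMaker with the
# Hugging Face Deep Learning Containers (PyTorch variant).
# Values below (script name, versions, instance type) are assumptions.
import sagemaker
from sagemaker.huggingface import HuggingFace

role = sagemaker.get_execution_role()  # IAM role with SageMaker permissions

# Hyperparameters are passed to the training script as CLI arguments
hyperparameters = {
    "model_name_or_path": "distilbert-base-uncased",  # example model
    "epochs": 3,
}

huggingface_estimator = HuggingFace(
    entry_point="train.py",           # your training script (assumed name)
    source_dir="./scripts",           # local directory containing the script
    instance_type="ml.p3.2xlarge",    # GPU instance; choose what fits your budget
    instance_count=1,
    role=role,
    transformers_version="4.6",       # illustrative container versions
    pytorch_version="1.7",
    py_version="py36",
    hyperparameters=hyperparameters,
)

# Start training: SageMaker pulls the Hugging Face container and runs
# train.py inside it on the requested infrastructure.
huggingface_estimator.fit()
```

Swapping the PyTorch container for the TensorFlow one is a matter of passing `tensorflow_version` instead of `pytorch_version`; the rest of the workflow stays the same.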