https://towardsdatascience.com/multi-class-classification-with-transformers-6cf7b59a033a
Transformers have been described as the fourth pillar of deep learning, alongside the likes of convolutional and recurrent neural networks.
From the perspective of natural language processing, however, transformers are much more than that. Since their introduction in 2017, they have come to dominate the majority of NLP benchmarks, and they continue to impress daily.
The thing is, transformers are damn cool. And with libraries like HuggingFace's transformers, it has become remarkably easy to build incredible solutions with them.
So, what's not to love? Incredible performance paired with the ultimate ease of use.
In this article, we'll work through building a multi-class classification model using transformers, from start to finish.