Shamanth Shetty for Kern AI

Tokenization of text with spaCy tokenizer

In the latest version of our software, your text is now automatically tokenized with the spaCy tokenizer of your choice, saving you time and energy with added features built exclusively for our users. This not only helps you build better labelling functions via pre-integrated metadata, but also makes it easier to label data manually. Kern offers an intuitive navigation system with features that help you manage labelling tasks rapidly in-house, drastically reducing the time, money, and support needed to deliver high-quality AI solutions in a hassle-free way.
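
To give you an idea of what spaCy tokenization looks like in plain Python (a minimal sketch, not our exact integration; the sample sentence is just an illustration):

```python
import spacy

# A blank English pipeline is enough for pure tokenization,
# no trained model download required.
nlp = spacy.blank("en")

doc = nlp("Kern AI tokenizes your records automatically with spaCy.")

# Each token exposes attributes that labelling functions can build on.
for token in doc:
    print(token.text, token.is_alpha, token.is_stop)
```

With a full pipeline such as `en_core_web_sm`, the same tokens also carry part-of-speech tags, lemmas, and entity information, which is the kind of pre-integrated metadata your labelling functions can use.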

Subscribe to our newsletter πŸ‘‰πŸΌ https://www.kern.ai/pages/open-source and stay up to date with the release so you don’t miss out on the chance to win a GeForce RTX 3090 Ti for our launch πŸ˜‰
