Hugging Face + Gradient

How to use Gradient and Hugging Face together

Get started with Hugging Face on Gradient

Hugging Face is setting a new standard for open source natural language processing models. The most popular Hugging Face library, Transformers, is at the time of writing approaching 50K stars on GitHub and is one of the fastest growing open source projects in machine learning. Transformers offers 8 architectures, including GPT-2, BERT, RoBERTa, XLM, DistilBERT, and more, with 32+ pretrained models covering 100+ languages.

Up and running with Transformers from Hugging Face

After you make a Paperspace account you'll need to head to the console and select Notebooks > Create.

You should now see the Create a Notebook flow. In the Recommended runtimes menu you should see a tile for Hugging Face called Transformers + NLP. Select this tile.

From the Create a Notebook screen you can select a number of pre-installed runtimes, including Hugging Face Transformers.

Next, select a machine or instance to run your notebook.

Here we select a Free GPU instance as well as a 6-hour auto-shutdown interval.

After this we can start our notebook and begin exploring!
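Once the notebook is running, a quick way to confirm the Transformers runtime works is the library's `pipeline` API. The sketch below runs sentiment analysis with the default pretrained model; the example sentence is our own, and the first call will download model weights, so it may take a moment.

```python
# Minimal smoke test for the Transformers runtime.
# The input sentence is illustrative; any English text works.
from transformers import pipeline

# Build a sentiment-analysis pipeline with the library's default model.
classifier = pipeline("sentiment-analysis")

# Run inference on a sample sentence and inspect the result.
result = classifier("Running notebooks on a free GPU is great!")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': ...}]
```

The pipeline returns a list of dictionaries, one per input, each containing a predicted label and a confidence score.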

Bonus: Webinar with the Hugging Face team!