1
No labeled data? No problem.
The 🤗 Transformers master branch now has a built-in pipeline for zero-shot text classification, to be included in the next release.
Try it out in the notebook here: colab.research.google.com/drive/1jocViLo…
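For readers who want to see what this looks like in code, here is a minimal sketch of the new pipeline (the text and candidate labels are just illustrative):

```python
# Minimal zero-shot classification sketch; text and labels are illustrative.
from transformers import pipeline

classifier = pipeline("zero-shot-classification")
result = classifier(
    "The new Transformers release adds a zero-shot classification pipeline.",
    candidate_labels=["machine learning", "sports", "politics"],
)
print(result["labels"][0], result["scores"][0])  # highest-scoring label first
```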
2
3
Want speedy Transformers models w/o a GPU?! 🧐
Starting with transformers v3.1.0, your models can run at the speed of light on commodity CPUs thanks to ONNX Runtime quantization! 🚀 Check out our 2nd blog post with ONNX Runtime on the subject! 🔥
medium.com/microsoftazure…
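The post walks through exporting a model to ONNX and then quantizing it. A hedged sketch of the ONNX Runtime dynamic-quantization step, assuming you have already exported an FP32 graph to model.onnx:

```python
# Dynamic (post-training) quantization of an exported ONNX graph.
# "model.onnx" is assumed to be a Transformers model already exported to ONNX.
from onnxruntime.quantization import QuantType, quantize_dynamic

quantize_dynamic(
    model_input="model.onnx",         # FP32 graph
    model_output="model-quant.onnx",  # INT8 weights -> smaller & faster on CPU
    weight_type=QuantType.QInt8,
)
```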
4
🔥Transformers' first-ever end-2-end multimodal demo was just released, leveraging LXMERT, a SOTA model for visual Q&A!
Model by @HaoTan5, @mohitban47, with an impressive implementation in Transformers by @avalmendoz (@uncnlp)
Notebook available here: colab.research.google.com/drive/18TyuMfZ… 🤗
5
The ultimate guide to encoder-decoder models!
Today, we're releasing part one explaining how they work and why they have become indispensable for NLG tasks such as summarization and translation.
> colab.research.google.com/drive/18ZBlS4t…
Subscribe for the full series: huggingface.curated.co
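If you just want to see an encoder-decoder model at work before diving into the post, here is a small summarization sketch (the checkpoint is one common choice, not the only one):

```python
# Summarization with a seq2seq (encoder-decoder) checkpoint; text is illustrative.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
article = (
    "Encoder-decoder models read the full input with the encoder and then "
    "generate the output token by token with the decoder, which makes them "
    "a natural fit for summarization and translation."
)
print(summarizer(article, max_length=40, min_length=10)[0]["summary_text"])
```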
6
Our API now includes a brand new pipeline: zero-shot text classification
This feature lets you classify sequences into class names you specify, out of the box, w/o any additional training, in just a few lines of code! 🚀
Try it out (and share screenshots 📷): huggingface.co/facebook/bart-…
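A hedged sketch of calling the hosted API over HTTP (replace API_TOKEN with your own token; the payload follows the documented zero-shot format):

```python
# Zero-shot classification through the hosted Inference API.
import requests

API_URL = "https://api-inference.huggingface.co/models/facebook/bart-large-mnli"
headers = {"Authorization": "Bearer API_TOKEN"}  # your personal token
payload = {
    "inputs": "I love the new zero-shot pipeline!",
    "parameters": {"candidate_labels": ["positive", "negative"]},
}
print(requests.post(API_URL, headers=headers, json=payload).json())
```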
7
We are honored to be awarded the Best Demo Paper for "Transformers: State-of-the-Art Natural Language Processing" at #emnlp2020 😍
Thank you to our wonderful team members and the fantastic community of contributors who make the library possible 🤗🤗🤗
aclweb.org/anthology/2020…
8
Release alert: the 🤗datasets library v1.2 is available now!
With:
- 611 datasets you can download in one line of Python
- 467 languages covered, 99 with at least 10 datasets
- efficient pre-processing to free you from memory constraints
Try it out at:
github.com/huggingface/da…
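The one-line download really is one line; a quick sketch ("squad" is just an example dataset):

```python
# Load a dataset in one line; it is memory-mapped from disk,
# so large datasets don't have to fit in RAM.
from datasets import load_dataset

squad = load_dataset("squad", split="train")
print(len(squad), squad[0]["question"])
```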
9
Fine-tuning a *3-billion* parameter model on a single GPU?
Now possible in transformers, thanks to the DeepSpeed/FairScale integrations!
Thank you @StasBekman for the seamless integration, and thanks to @Microsoft and @FacebookAI teams for their support!
huggingface.co/blog/zero-deep…
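In practice, enabling ZeRO goes through the Trainer. A hedged sketch, assuming you have a DeepSpeed config file named ds_config.json plus an existing model and dataset:

```python
# Point TrainingArguments at a DeepSpeed config to enable ZeRO partitioning/offload.
from transformers import Trainer, TrainingArguments

args = TrainingArguments(
    output_dir="out",
    per_device_train_batch_size=1,
    deepspeed="ds_config.json",  # ZeRO stage, offload, fp16 settings live here
)
# trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
# trainer.train()
```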
10
🔥We're launching the new huggingface.co and it's incredible
🚀Play live with 10+ billion-parameter models, deploy them instantly in production with our hosted API, and join the 500 organizations using our hub to host/share models & datasets
And one more thing... 👇
11
🚨Transformers is expanding to Speech!🚨
🤗Transformers v4.3.0 is out and we are excited to welcome @FacebookAI's Wav2Vec2 as the first Automatic Speech Recognition model to our library!
👉Now, you can transcribe your audio files directly on the hub: huggingface.co/facebook/wav2v…
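Transcription in a few lines looks roughly like this (a sketch assuming a 16 kHz mono WAV file; the file path is a placeholder):

```python
# Speech-to-text with Wav2Vec2; "sample.wav" is a placeholder 16 kHz mono file.
import soundfile as sf
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-960h")
model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-960h")

speech, sampling_rate = sf.read("sample.wav")
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(inputs.input_values).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids))
```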
12
The new SOTA is in Transformers! DeBERTa-v2 beats the human baseline on SuperGLUE and reaches a crazy 91.7% dev accuracy on the MNLI task.
It beats T5 while being 10x smaller!
DeBERTa-v2 contributed by @Pengcheng2020 from @MSFTResearch
Try it directly on the hub: huggingface.co/microsoft/debe…
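To poke at the MNLI checkpoint directly, here is a small NLI sketch (the entailment/neutral/contradiction label order comes from the model's own config):

```python
# Natural language inference with the MNLI-finetuned DeBERTa-v2 checkpoint.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

name = "microsoft/deberta-v2-xlarge-mnli"
tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForSequenceClassification.from_pretrained(name)

inputs = tokenizer(
    "A soccer game with multiple males playing.",  # premise
    "Some men are playing a sport.",               # hypothesis
    return_tensors="pt",
)
with torch.no_grad():
    probs = model(**inputs).logits.softmax(-1)[0]
print({model.config.id2label[i]: round(p.item(), 3) for i, p in enumerate(probs)})
```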
13
$40M series B! 🙏Thank you open source contributors, pull requesters, issue openers, notebook creators, model architects, tweeting supporters & community members all over the 🌎!
We couldn't do what we do & be where we are - in a field dominated by big tech - without you!
14
🔥Fine-Tuning @FacebookAI's Wav2Vec2 for Speech Recognition is now possible in Transformers🔥
Not only for English but for 53 languages 🤯
Check out the tutorials:
👉 Train Wav2Vec2 on TIMIT huggingface.co/blog/fine-tune…
👉 Train XLSR-Wav2Vec2 on Common Voice
huggingface.co/blog/fine-tune…
15
Last week, EleutherAI released two checkpoints for GPT Neo, an *Open Source* replication of OpenAI's GPT-3
These checkpoints, of 1.3B and 2.7B parameters, are now available in 🤗 Transformers!
The generation capabilities are truly 🤯. Try it now on the Hub: huggingface.co/EleutherAI/gpt…
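Generating text with one of the new checkpoints is a one-line pipeline call; a sketch with the 1.3B model (the prompt is arbitrary):

```python
# Open-ended generation with GPT-Neo 1.3B; the prompt is just an example.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")
out = generator("In a shocking finding, scientists discovered",
                max_length=50, do_sample=True, temperature=0.9)
print(out[0]["generated_text"])
```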
16
🤗 Transformers meets VISION 📸🖼️
v4.6.0 is the first CV-dedicated release!
- CLIP by @OpenAI, for image-text similarity and zero-shot image classification
- ViT by @GoogleAI and DeiT by @FacebookAI, for SOTA image classification
Try ViT/DeiT on the hub (Mobile too!):
huggingface.co/google/vit-bas…
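Classifying an image with ViT takes a handful of lines; a sketch (the image path is a placeholder, and DeiT works the same way with its own checkpoint):

```python
# Image classification with ViT; "cat.jpg" is a placeholder for any RGB image.
import torch
from PIL import Image
from transformers import ViTFeatureExtractor, ViTForImageClassification

extractor = ViTFeatureExtractor.from_pretrained("google/vit-base-patch16-224")
model = ViTForImageClassification.from_pretrained("google/vit-base-patch16-224")

image = Image.open("cat.jpg").convert("RGB")
inputs = extractor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(model.config.id2label[logits.argmax(-1).item()])  # ImageNet class name
```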
17
🔥JAX meets Transformers🔥
@GoogleAI's JAX/Flax library can now be used as Transformers' backbone ML library.
JAX/Flax makes distributed training on TPU effortless and highly efficient!
👉 Google Colab: colab.research.google.com/github/hugging…
👉 Runtime evaluation:
github.com/huggingface/tr…
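Switching backends is mostly a matter of loading a Flax model class; a minimal sketch with BERT:

```python
# Same checkpoint, JAX/Flax backend: Flax models consume NumPy arrays.
from transformers import BertTokenizerFast, FlaxBertModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = FlaxBertModel.from_pretrained("bert-base-uncased")

inputs = tokenizer("JAX meets Transformers!", return_tensors="np")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence, hidden_size)
```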
18
GPT-Neo, the #OpenSource cousin of GPT-3, can do practically anything in #NLP, from sentiment analysis to writing SQL queries: just tell it what to do, in your own words. 🤯
How does it work? 🧐
Want to try it out? 🎮
👉 huggingface.co/blog/few-shot-…
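The "tell it what to do in your own words" part is just prompt design; a hedged few-shot sketch with the 1.3B checkpoint (the examples inside the prompt are made up):

```python
# Few-shot sentiment classification via prompting; no fine-tuning involved.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/gpt-neo-1.3B")
prompt = (
    "Tweet: I loved the new release!\nSentiment: positive\n"
    "Tweet: The food was cold and bland.\nSentiment: negative\n"
    "Tweet: What a fantastic library!\nSentiment:"
)
out = generator(prompt, max_new_tokens=3, do_sample=False)
print(out[0]["generated_text"][len(prompt):].strip())  # expected: "positive"
```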
19
We've heard your requests! Over the past few months ... we've been working on a Hugging Face Course!
The release is imminent. Sign up for the newsletter to know when it comes out: huggingface.curated.co
Sneak peek: Transfer Learning with @GuggerSylvain: youtube.com/watch?v=BqqfQn…
20
The first part of the Hugging Face Course is finally out!
Come learn how the 🤗 Ecosystem works 🥳: Transformers, Tokenizers, Datasets, Accelerate, the Model Hub!
Share it with your friends who want to learn NLP - it's free!
Come join us at hf.co/course
21
🤗Transformers v4.7.0 was just released with 🖼️DETR by @FacebookAI!
DETR is an object detection model that can use models from timm by @wightmanr as its backbone.
Contributed by @NielsRogge, try it out: colab.research.google.com/github/NielsRo…
v4.7.0 launches with support for PyTorch v1.9.0!
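Running DETR end to end looks roughly like this (a sketch; the image path is a placeholder and the 0.9 confidence threshold is just a common choice):

```python
# Object detection with DETR; "street.jpg" is a placeholder image.
import torch
from PIL import Image
from transformers import DetrFeatureExtractor, DetrForObjectDetection

extractor = DetrFeatureExtractor.from_pretrained("facebook/detr-resnet-50")
model = DetrForObjectDetection.from_pretrained("facebook/detr-resnet-50")

image = Image.open("street.jpg").convert("RGB")
inputs = extractor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Last class index is "no object"; keep confident detections only.
probs = outputs.logits.softmax(-1)[0, :, :-1]
keep = probs.max(-1).values > 0.9
print([model.config.id2label[i.item()] for i in probs[keep].argmax(-1)])
```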
22
🚨It's time for another community event on July 7th-July 14th🚨
We've partnered up with @GoogleAI and @googlecloud to teach you how to use JAX/Flax for NLP & CV🚀
You define your project - we give you guidance, scripts & free TPU VMs 🤗
To participate:
discuss.huggingface.co/t/open-to-the-…
23
Document parsing meets 🤗 Transformers!
📄#LayoutLMv2 and #LayoutXLM by @MSFTResearch are now available! 🔥
They're capable of parsing document images (like PDFs) by incorporating text, layout, and visual information, as in the @Gradio demo below ⬇️
huggingface.co/spaces/nielsr/…
24
EleutherAI's GPT-J is now in 🤗 Transformers: a 6-billion-parameter autoregressive model with crazy generative capabilities!
It shows impressive results in:
- 🧮Arithmetic
- ⌨️Code writing
- 👀NLU
- 📜Paper writing
- ...
Play with it to see how powerful it is:
huggingface.co/EleutherAI/gpt…
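Loading and sampling from GPT-J follows the usual causal-LM pattern; a sketch (the full-precision weights need on the order of 24 GB of memory, so half precision or a big machine is advisable):

```python
# Causal-LM generation with GPT-J-6B; needs substantial RAM (~24 GB in fp32).
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-j-6B")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-j-6B")

inputs = tokenizer("def fibonacci(n):", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=40, do_sample=True, temperature=0.8)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```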
25
TODAY'S A BIG DAY
Spaces are now publicly available
Build, host, and share your ML apps on @huggingface in just a few minutes.
There's no limit to what you can build. Be creative, and share what you make with the community.
🙏 @streamlit and @Gradio
hf.co/spaces/launch
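A Space can be as small as a single app.py; a minimal Gradio sketch (the sentiment pipeline is just an example model choice):

```python
# app.py for a minimal Gradio Space: a text box wired to a sentiment pipeline.
import gradio as gr
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")

def classify(text):
    result = sentiment(text)[0]
    return f"{result['label']} ({result['score']:.2f})"

gr.Interface(fn=classify, inputs="text", outputs="text").launch()
```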