Popular tweets from Hugging Face (@huggingface), ranked by likes

26
🖌️ Stable Diffusion meets 🧨Diffusers! Releasing diffusers==0.2.2 with full support of @StabilityAI's Stable Diffusion & schedulers 🔥 Google colab: 👉 colab.research.google.com/github/hugging… Code snippet 👇
27
Transformers v4.22 is out, and includes the first VIDEO models! 🎥 💥VideoMAE: masked auto-encoders for video 💥X-CLIP: CLIP for video-language Other nice goodies: 💥Swin Transformer v2 💥Pegasus-X 💥Donut 💥MobileViT ... and macOS support (device="mps")!
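The `device="mps"` mention refers to PyTorch's Metal Performance Shaders backend for Apple-silicon GPUs. A minimal device-selection sketch, assuming PyTorch is installed (the fallback chain here is a common convention, not code from the release):

```python
# Pick the best available PyTorch device, falling back gracefully.
# "mps" is the Metal backend for Apple silicon (added in PyTorch 1.12);
# getattr guards against older torch builds that lack torch.backends.mps.
try:
    import torch

    if getattr(torch.backends, "mps", None) is not None and torch.backends.mps.is_available():
        device = "mps"
    elif torch.cuda.is_available():
        device = "cuda"
    else:
        device = "cpu"
except ImportError:
    device = "cpu"  # torch not installed; stay on CPU-only code paths

print(device)
```

Recent `transformers` versions accept such a device string when building a pipeline, e.g. `pipeline("text-classification", device=device)`.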
28
The new SOTA is in Transformers! DeBERTa-v2 beats the human baseline on SuperGLUE and reaches a crazy 91.7% dev accuracy on the MNLI task. Beats T5 while being 10x smaller! DeBERTa-v2 was contributed by @Pengcheng2020 from @MSFTResearch. Try it directly on the hub: huggingface.co/microsoft/debe…
29
We've heard your requests! Over the past few months ... we've been working on a Hugging Face Course! The release is imminent. Sign up for the newsletter to know when it comes out: huggingface.curated.co Sneak peek: Transfer Learning with @GuggerSylvain: youtube.com/watch?v=BqqfQn…
30
🔥JAX meets Transformers🔥 @GoogleAI's JAX/Flax library can now be used as Transformers' backbone ML library. JAX/Flax makes distributed training on TPU effortless and highly efficient! 👉 Google Colab: colab.research.google.com/github/hugging… 👉 Runtime evaluation: github.com/huggingface/tr…
31
GPT-Neo, the #OpenSource cousin of GPT-3, can do practically anything in #NLP from sentiment analysis to writing SQL queries: just tell it what to do, in your own words. 🤯 How does it work? 🧐 Want to try it out? 🎮 👉 huggingface.co/blog/few-shot-…
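"Telling it what to do in your own words" is few-shot prompting: the task description and a handful of worked examples are packed into one text prompt, and a causal LM like GPT-Neo is asked to continue it. A minimal sketch in plain Python (the template and labels here are illustrative, not the blog post's exact format):

```python
def build_few_shot_prompt(task, examples, query):
    """Assemble a few-shot prompt: task description, worked examples,
    then the new input left open for the model to complete."""
    lines = [task, ""]
    for text, label in examples:
        lines.append(f"Text: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    lines.append(f"Text: {query}")
    lines.append("Sentiment:")  # the model's continuation is the prediction
    return "\n".join(lines)

prompt = build_few_shot_prompt(
    "Classify the sentiment of each text as positive or negative.",
    [("I loved this movie!", "positive"), ("Terrible service.", "negative")],
    "The soundtrack was wonderful.",
)
print(prompt)
```

The resulting string would be fed to the model as-is; no fine-tuning is involved, which is what makes the approach "just tell it what to do".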
32
🔥Transformers' first-ever end-2-end multimodal demo was just released, leveraging LXMERT, a SOTA model for visual Q&A! Model by @HaoTan5, @mohitban47, with an impressive implementation in Transformers by @avalmendoz (@uncnlp) Notebook available here: colab.research.google.com/drive/18TyuMfZ… 🤗
33
TODAY'S A BIG DAY: Spaces are now publicly available! Build, host, and share your ML apps on @huggingface in just a few minutes. There's no limit to what you can build. Be creative, and share what you make with the community. 🙏 @streamlit and @Gradio hf.co/spaces/launch
34
Transformers v3.1.0 is out, the first PyPI release with 💫 PEGASUS, DPR, mBART 💫 📖 New & simpler docs and tutorials 🎤 Dialogue & zero-shot pipelines ⭐️ New encoder-decoder architectures: Bert2GPT2, Roberta2Roberta, Longformer2Roberta, ... 📕 Named outputs:
35
Last week, @MetaAI introduced NLLB-200: a massive translation model supporting 200 languages. Models are now available through the Hugging Face Hub, using 🤗Transformers' main branch. Models on the Hub: huggingface.co/facebook/nllb-… Learn about NLLB-200: ai.facebook.com/research/no-la…
36
Our API now includes a brand new pipeline: zero-shot text classification This feature lets you classify sequences into the specified class names out-of-the-box w/o any additional training in a few lines of code! 🚀 Try it out (and share screenshots 📷): huggingface.co/facebook/bart-…
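Under the hood, this zero-shot pipeline reframes classification as natural language inference: each candidate label becomes a hypothesis like "This example is {label}.", and an entailment model (BART fine-tuned on MNLI, per the linked model page) scores it against the input. A mock sketch of the mechanism, where `toy_score` is a keyword-counting stand-in for the real NLI model's entailment logit:

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def zero_shot_classify(sequence, candidate_labels, entailment_score):
    """Score each label by posing it as an NLI hypothesis against the sequence.
    `entailment_score(premise, hypothesis)` stands in for a real NLI model."""
    hypotheses = [f"This example is {label}." for label in candidate_labels]
    logits = [entailment_score(sequence, h) for h in hypotheses]
    probs = softmax(logits)
    ranked = sorted(zip(candidate_labels, probs), key=lambda pair: -pair[1])
    return {
        "sequence": sequence,
        "labels": [label for label, _ in ranked],
        "scores": [score for _, score in ranked],
    }

# Toy scorer: counts topic keywords in the premise. A real NLI model
# returns an entailment logit instead; this just makes the demo runnable.
KEYWORDS = {
    "technology": {"phone", "camera", "battery"},
    "cooking": {"recipe", "oven"},
    "politics": {"election", "vote"},
}

def toy_score(premise, hypothesis):
    label = hypothesis.removeprefix("This example is ").rstrip(".")
    words = set(premise.lower().replace(".", "").split())
    return float(len(KEYWORDS.get(label, set()) & words))

result = zero_shot_classify(
    "The new phone has a great camera and battery life.",
    ["technology", "cooking", "politics"],
    toy_score,
)
print(result["labels"][0])
```

The output shape (ranked `labels` with matching `scores`) mirrors what the hosted pipeline returns; only the scoring function differs.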
37
🤗Transformers v4.7.0 was just released with 🖼️DETR by @FacebookAI! DETR is an Object Detection model that can take models from timm by @wightmanr as a backbone. Contributed by @NielsRogge, try it out: colab.research.google.com/github/NielsRo… v4.7.0 launches with support for PyTorch v1.9.0!
38
🚨It's time for another community event on July 7th-July 14th🚨 We've partnered up with @GoogleAI and @googlecloud to teach you how to use JAX/Flax for NLP & CV🚀 You define your project - we give you guidance, scripts & free TPU VMs 🤗 To participate: discuss.huggingface.co/t/open-to-the-…