Popular tweets from Hugging Face (@huggingface), ranked by retweets

26
🔥We're launching the new huggingface.co and it's incredible 🚀Play live with models of 10+ billion parameters, deploy them instantly in production with our hosted API, join the 500 organizations using our hub to host/share models & datasets And one more thing... 👇
27
TODAY'S A BIG DAY Spaces are now publicly available Build, host, and share your ML apps on @huggingface in just a few minutes. There's no limit to what you can build. Be creative, and share what you make with the community. 🙏 @streamlit and @Gradio hf.co/spaces/launch
28
Transformers v3.1.0 is out, first pypi release with 💫 PEGASUS, DPR, mBART 💫 📖 New & simpler docs and tutorials 🎤 Dialogue & zero-shot pipelines ⭐️ New encoder-decoder architectures: Bert2GPT2, Roberta2Roberta, Longformer2Roberta, ... 📕 Named outputs:
29
We've heard your requests! Over the past few months ... we've been working on a Hugging Face Course! The release is imminent. Sign up for the newsletter to know when it comes out: huggingface.curated.co Sneak peek: Transfer Learning with @GuggerSylvain: youtube.com/watch?v=BqqfQn…
30
🔥Transformers' first-ever end-2-end multimodal demo was just released, leveraging LXMERT, SOTA model for visual Q&A! Model by @HaoTan5, @mohitban47, with an impressive implementation in Transformers by @avalmendoz (@uncnlp) Notebook available here: colab.research.google.com/drive/18TyuMfZ… 🤗
31
🖌️ Stable Diffusion meets 🧨Diffusers! Releasing diffusers==0.2.2 with full support of @StabilityAI's Stable Diffusion & schedulers 🔥 Google colab: 👉 colab.research.google.com/github/hugging… Code snippet 👇
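The schedulers this release supports all drive the same iterative denoising loop that Stable Diffusion runs at every timestep. The sketch below is a plain-NumPy toy illustration of one classifier-free-guidance step, not the diffusers API; the arrays, the `alpha_t` value, and the simplified update rule are all illustrative assumptions.

```python
import numpy as np

def guided_noise(noise_uncond, noise_cond, guidance_scale):
    """Classifier-free guidance: push the predicted noise away from the
    unconditional prediction and toward the text-conditioned one."""
    return noise_uncond + guidance_scale * (noise_cond - noise_uncond)

def denoise_step(x_t, predicted_noise, alpha_t):
    """One simplified denoising update: strip a scaled portion of the
    predicted noise from the current latent (toy stand-in for a scheduler)."""
    return (x_t - (1 - alpha_t) * predicted_noise) / np.sqrt(alpha_t)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 4))       # toy latent
eps_u = rng.standard_normal((4, 4))   # unconditional noise prediction
eps_c = rng.standard_normal((4, 4))   # text-conditioned noise prediction

eps = guided_noise(eps_u, eps_c, guidance_scale=7.5)
x_next = denoise_step(x, eps, alpha_t=0.99)
print(x_next.shape)  # latent shape is preserved at every step
```

Swapping schedulers in diffusers changes only how `denoise_step` is computed; the guidance step and the loop structure stay the same.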
32
The new SOTA is in Transformers! DeBERTa-v2 beats the human baseline on SuperGLUE and reaches a crazy 91.7% dev accuracy on the MNLI task. Beats T5 while being 10x smaller! DeBERTa-v2 contributed by @Pengcheng2020 from @MSFTResearch Try it directly on the hub: huggingface.co/microsoft/debe…
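DeBERTa's headline idea is disentangled attention: attention scores are a sum of content-to-content, content-to-position, and position-to-content terms. The NumPy sketch below illustrates that decomposition only; the random projections, toy shapes, and simplified relative-position handling are assumptions, not the paper's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(1)
seq_len, d = 6, 8
H = rng.standard_normal((seq_len, d))  # content embeddings
P = rng.standard_normal((seq_len, d))  # relative-position embeddings (toy)

def proj():
    return rng.standard_normal((d, d))

Wq_c, Wk_c, Wq_p, Wk_p = proj(), proj(), proj(), proj()

# Disentangled attention: the score matrix is a sum of three terms
c2c = (H @ Wq_c) @ (H @ Wk_c).T   # content -> content
c2p = (H @ Wq_c) @ (P @ Wk_p).T   # content -> position
p2c = (P @ Wq_p) @ (H @ Wk_c).T   # position -> content

scores = (c2c + c2p + p2c) / np.sqrt(3 * d)
attn = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
print(attn.shape)  # (6, 6): each token attends over all positions
```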
33
🔥JAX meets Transformers🔥 @GoogleAI's JAX/Flax library can now be used as Transformers' backbone ML library. JAX/Flax makes distributed training on TPU effortless and highly efficient! 👉 Google Colab: colab.research.google.com/github/hugging… 👉 Runtime evaluation: github.com/huggingface/tr…
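The "effortless distributed training" here comes from JAX's `pmap`/`psum` pattern: every device computes a gradient on its shard, then an all-reduce averages them before the update. The sketch below mimics that data flow in plain NumPy with virtual devices; the linear model, learning rate, and shard layout are illustrative assumptions, not Transformers' training code.

```python
import numpy as np

def grad_mse(w, x, y):
    """Gradient of mean-squared error for a 1-D linear model y ≈ w * x."""
    return np.mean(2 * (w * x - y) * x)

# Toy data split into equal "device" shards, mimicking jax.pmap on TPU cores
rng = np.random.default_rng(2)
x = rng.standard_normal(8)
y = 3.0 * x
shards = [(x[i::4], y[i::4]) for i in range(4)]  # 4 virtual devices

w = 0.0
# Each device computes a local gradient; a psum-style all-reduce averages them
local_grads = [grad_mse(w, xs, ys) for xs, ys in shards]
synced_grad = np.mean(local_grads)
w -= 0.1 * synced_grad  # every device applies the identical synced update
print(w)
```

Because the shards are equal-sized, the averaged shard gradients equal the full-batch gradient, which is exactly the property the all-reduce relies on.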
34
💫 Perceiver IO by @DeepMind is now available in 🤗 Transformers! A general purpose deep learning model that works on any modality and combinations thereof 📜text 🖼️ images 🎥 video 🔊 audio ☁️ point clouds ... Read more in our blog post: huggingface.co/blog/perceiver
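Perceiver IO handles any modality because a fixed-size latent array cross-attends to the raw inputs: compute and output shape depend on the latents, not on how long the input is. A minimal NumPy sketch of that read step, with made-up sizes and no learned projections (both assumptions for brevity):

```python
import numpy as np

def cross_attend(latents, inputs):
    """Perceiver-style read: a fixed latent array queries an
    arbitrary-length input; output shape depends only on the latents."""
    scores = latents @ inputs.T / np.sqrt(latents.shape[1])
    weights = np.exp(scores) / np.exp(scores).sum(axis=-1, keepdims=True)
    return weights @ inputs

rng = np.random.default_rng(3)
latents = rng.standard_normal((16, 32))      # fixed-size latent bottleneck

text_like = rng.standard_normal((100, 32))   # "100 tokens"
audio_like = rng.standard_normal((7, 32))    # "7 audio frames"

print(cross_attend(latents, text_like).shape)   # (16, 32)
print(cross_attend(latents, audio_like).shape)  # (16, 32), same regardless of input length
```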
35
Our API now includes a brand new pipeline: zero-shot text classification This feature lets you classify sequences into the specified class names out-of-the-box w/o any additional training in a few lines of code! 🚀 Try it out (and share screenshots 📷): huggingface.co/facebook/bart-…
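Under the hood, the zero-shot pipeline casts each candidate class name as an NLI hypothesis (roughly "This example is about {label}.") and ranks labels by the model's entailment score. The sketch below shows only that final ranking step with made-up logits; in the real pipeline they come from an NLI model such as the linked facebook/bart-large-mnli.

```python
import numpy as np

def zero_shot_scores(entailment_logits, labels):
    """Rank candidate labels by softmax over their entailment logits
    (toy stand-in for the pipeline's post-processing)."""
    logits = np.asarray(entailment_logits, dtype=float)
    probs = np.exp(logits - logits.max())  # stable softmax
    probs /= probs.sum()
    order = np.argsort(-probs)
    return [(labels[i], float(probs[i])) for i in order]

# Made-up entailment logits for one sequence and three candidate labels
labels = ["sports", "politics", "technology"]
print(zero_shot_scores([0.2, -1.3, 2.9], labels))  # highest-probability label first
```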
36
🤗Transformers v4.7.0 was just released with 🖼️DETR by @FacebookAI! DETR is an Object Detection model that can take models from timm by @wightmanr as a backbone. Contributed by @NielsRogge, try it out: colab.research.google.com/github/NielsRo… v4.7.0 launches with support for PyTorch v1.9.0!
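What makes DETR different from classic detectors is set prediction: a fixed number of object queries is matched one-to-one against the ground-truth boxes by minimizing a bipartite matching cost. The sketch below brute-forces that assignment with stdlib `itertools`; the cost values are made up, and the real implementation uses the Hungarian algorithm rather than enumeration.

```python
from itertools import permutations

def best_match(cost):
    """Brute-force one-to-one assignment of predictions to targets,
    minimizing total cost (toy stand-in for the Hungarian algorithm)."""
    n_pred, n_tgt = len(cost), len(cost[0])
    best, best_perm = float("inf"), None
    for perm in permutations(range(n_pred), n_tgt):
        total = sum(cost[p][t] for t, p in enumerate(perm))
        if total < best:
            best, best_perm = total, perm
    return best_perm, best

# Made-up matching costs: rows = 3 object queries, cols = 2 ground-truth boxes
cost = [
    [0.9, 0.1],
    [0.2, 0.8],
    [0.5, 0.6],
]
perm, total = best_match(cost)
print(perm, total)  # query perm[t] is assigned to ground-truth box t
```

Queries left unmatched (here, query 2) are trained to predict "no object", which is how DETR avoids duplicate boxes without non-maximum suppression.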
37
🚨It's time for another community event on July 7th-July 14th🚨 We've partnered up with @GoogleAI and @googlecloud to teach you how to use JAX/Flax for NLP & CV🚀 You define your project - we give you guidance, scripts & free TPU VMs 🤗 To participate: discuss.huggingface.co/t/open-to-the-…
38
Transformers v4.22 is out, and includes the first VIDEO models! 🎥 💥VideoMAE: masked auto-encoders for video 💥X-CLIP: CLIP for video-language Other nice goodies: 💥Swin Transformer v2 💥Pegasus-X 💥Donut 💥MobileViT ... and macOS support (device="mps")!
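VideoMAE's trick is aggressive masking: most video patches ("tubes" that extend across frames) are hidden, and only the small visible remainder is encoded, which is what makes pretraining on video affordable. A minimal NumPy sketch of picking the visible patch indices; the patch count and 90% mask ratio are illustrative assumptions in the spirit of the paper, not the library's implementation.

```python
import numpy as np

def random_tube_mask(num_patches, mask_ratio, rng):
    """Pick which patch indices the encoder sees: VideoMAE-style masking
    hides most patches and encodes only the visible rest."""
    num_masked = int(num_patches * mask_ratio)
    perm = rng.permutation(num_patches)
    return np.sort(perm[num_masked:])  # visible indices only

rng = np.random.default_rng(4)
num_patches = 1568  # e.g. 16 frames, 14x14 spatial patches, tubelet size 2
visible = random_tube_mask(num_patches, mask_ratio=0.9, rng=rng)
print(len(visible))  # only ~10% of patches reach the encoder
```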