transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Help out
- Issues
- [`auto_docstring`] needs to be only run on __doc__
- investigate modular conversion speedups
- Fix PIL backend fallback when torchvision is unavailable
- modeling_utils unsafely accesses sys.modules[]
- Llama3 video fix
- Pass packed boundary metadata to Qwen3.5 linear-attention fast kernels from data collator
- Recent transformers versions break models using `remote_code`
- [WIP][Fix] GLM 5 set `apply_rotary_pos_emb` to `is_neox_style=False` && remove `F.relu()`
- Trainer: set skip_logits for loss-only eval when liger enabled
- Inconsistent tokenization and BLEU scores between AutoTokenizer and NllbTokenizerFast