transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're a real pro, receive undocumented methods or classes and supercharge your commit history.
Help out
- Issues
- Fix some missing / incorrect entries in auto files
- Avoid floating point math for ceil operations (an integer-only sketch follows this list)
- [ColQwen2] Refactor output tracing (issue #43979)
- `PixioPatchEmbeddings.forward` supports `interpolate_pos_encoding` but it is not propagated through `PixioEmbeddings`/`PixioModel`
- Support packed sequences for linear attention models (e.g. Qwen3.5)
- Allow kernel modules to declare their preferred mask function
- fix(gpt2): Resolve NaN/Inf issue in lm_head on Python 3.13 with tied weights
- fix: `torch_float` should return float, not int (a hedged sketch follows this list)
- transformers serve + llamacpp
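A minimal sketch of the integer-only ceiling pattern the "Avoid floating point math for ceil operations" issue refers to; the helper name `ceil_div` is illustrative, not the actual patch:

```python
import math

def ceil_div(a: int, b: int) -> int:
    # Ceiling division in pure integer arithmetic; equivalent to
    # math.ceil(a / b) for b > 0, but without the float round-trip.
    return -(-a // b)  # or: (a + b - 1) // b for non-negative a, positive b

# Why it matters: a / b goes through a 64-bit float, which cannot
# represent every large integer exactly, so math.ceil can be off by one.
a = 2**53 + 1  # smallest positive int not exactly representable as a float
assert ceil_div(a, 1) == a
assert math.ceil(a / 1) == a - 1  # float rounding silently dropped the +1
```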
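And a hedged sketch of what the `torch_float` fix is about, assuming the helper mirrors the companion `torch_int` pattern used for tracing-safe casts; this illustrates the reported truncation bug, not the library's actual code:

```python
import torch

def torch_float(x):
    # Sketch only: while tracing, keep the value symbolic as a float
    # tensor; otherwise coerce to a Python float.
    if torch.jit.is_tracing() and isinstance(x, torch.Tensor):
        return x.to(torch.float32)
    # The reported bug is a fallback like `return int(x)`, which would
    # silently truncate, e.g. int(2.5) == 2.
    return float(x)
```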
- Docs (Python not yet supported)