transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Python not yet supported. 46 Subscribers.
Help out
- Issues
- Add HyperCLOVAX SEED Think 14B
- Qwen3VL/Qwen2.5VL VisionAttention breaks torch.compile with flash_attention_2
- Add HyperCLOVA X SEED Think 14B
- fix
- Fix: Add correct return behaviour when output_hidden_states=True for CLIP and SIGLIP vision models
- Whisper `return_language` with pipeline no longer working
- First-class fine-tuning support for Mamba / Mamba-2 SSMs — architecture is production-ready, but the training path in Transformers isn't
- [Bug] Catastrophic gradient explosion (NaN) in RLHF with Qwen3.5 due to 3D position_ids forcing SDPA Math fallback and BF16 collapse
- Fix tie_weights skipping logic is not tied to model thread scope
- Nonexistent import from image_utils
- Docs