transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Help out
- Issues
- Let's CI go great
- [Gemma 4] Support per-layer FlashAttention: FA2 for sliding layers, SDPA for global layers
- Add PolarQuant quantization: Hadamard-rotated Lloyd-Max optimal weights + KV cache
- Gemma4 31B-IT Multi-GPU inference CUDA OOM
- feat: make timesfm2_5 onnx export compatible
- Gemma4: chat_template missing from tokenizer_config.json, requires manual loading from separate file
- [Gemma 4] mm_token_type_ids required for text-only fine-tuning - should default to zeros
- Proposal: Agent-first CLI
- DO NOT MERGE - model creation skill
- Fix gemma4 flash-attention-incompatible head-dim=512