transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're really pro, receive undocumented methods or classes and supercharge your commit history.
Python not yet supported
46 Subscribers
Help out
- Issues
  - Fallback to kernels-community/flash-attn2 is blocked by other checks when fa2 is not installed
  - from_pretrained orchestration + distributed save/load
  - CLIPTextModel / CLIPVisionModel fail to load old checkpoints after architecture flattening
  - Add example for iterative chatting with MLLMs
  - fix(clipseg): auto-fix failing tests
  - transformers serve crashes with AttributeError: 'Gemma4Processor' object has no attribute '_tokenizer'
  - MoE expert parallelism + sequence parallelism
  - fix(altclip): auto-fix failing tests
  - Fix flash_attention_3 detection and import for hopper wheel installs
  - Make Gemma4ClippableLinear inherit from nn.Linear for PEFT/LoRA compatibility
- Docs
  - Python not yet supported
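One of the open issues above asks for an example of iterative chatting with MLLMs. A minimal sketch of that pattern is a growing `messages` list in the chat-template format that transformers processors accept; here `generate_reply` is a hypothetical stand-in for the real model call (e.g. `processor.apply_chat_template(...)` followed by `model.generate(...)`), so the sketch stays self-contained.

```python
# Sketch of an iterative (multi-turn) chat loop with a multimodal LLM.
# The message dicts follow the chat-template schema used by transformers
# processors; generate_reply is a placeholder for an actual model call.

def make_user_turn(text, image_url=None):
    """Build one user message; content is a list of typed parts."""
    content = []
    if image_url is not None:
        content.append({"type": "image", "url": image_url})
    content.append({"type": "text", "text": text})
    return {"role": "user", "content": content}

def chat_step(history, user_text, generate_reply, image_url=None):
    """Append a user turn, obtain a reply, append it, and return it."""
    history.append(make_user_turn(user_text, image_url))
    reply = generate_reply(history)  # stands in for model.generate(...)
    history.append({"role": "assistant", "content": reply})
    return reply

# Usage with a stub "model" that ignores the history:
history = []
chat_step(history, "What is in this image?", lambda h: "A cat.",
          image_url="https://example.com/cat.png")
chat_step(history, "What color is it?", lambda h: "Orange.")
# history now holds four turns: user, assistant, user, assistant
```

The key point the requested example would document is that each call re-sends the full `history`, so the model sees all prior turns (and any images) as context.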