transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
- Issues
- Support for sequence-level custom metrics with decoder-only models
- Fix: Handling fused qkv result tensor slicing for tp sharded qkv weights
- [`FA`] Refactor FA CB kwargs
- [MPS] Upstream correctness issue in attention when the value head dim differs from the query head dim
- Add SarvamMLA model (sarvamai/sarvam-105b)
- Improve clarity and grammar in Auto Classes documentation
- Add /v1/completions endpoint (OpenAI legacy completions API) to `transformers serve`
- Fix position_ids docstring in modeling_flash_attention_utils.py
- Fix crash in Qwen2_5_VLProcessor when using batched input with padding=False
- apply_chat_template returns all-zero assistant_masks for multimodal inputs