transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're really pro, receive undocumented methods or classes and supercharge your commit history.
Python not yet supported · 46 Subscribers
View all Subscribers · Add a CodeTriage badge to transformers
Help out
- Issues
- Fix ProphetNet forward to handle tuple encoder_outputs
- TypeError: Received a NoneType for argument video_processor, but a BaseVideoProcessor was expected (occurs when using doc-ocr)
- Optimize LlamaAttention by fusing QKV projections
- Image Embedding Models (Feature extractors) should have a `.hidden_size`
- Refactor benchmark utils: add type hints, GPU metrics helper, and con…
- Support encoder text classification for sequence to sequence models like BART and T5
- 🌐 [i18n-KO] Updated `perf_train_gpu_many.md`
- Add Mistral tokenizer missing methods
- Add Timestamp Support for Voxtral Models
- Document the /v1/models endpoint
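One of the open issues above asks to optimize LlamaAttention by fusing the QKV projections. The idea behind that kind of fusion can be sketched independently of the library: three separate projection matrices are concatenated so that a single matrix multiply produces Q, K, and V at once, which is then split. The sketch below uses plain NumPy and invented names (`Wq`, `Wk`, `Wv`, `W_fused`); it is not transformers' actual implementation.

```python
import numpy as np

# Hypothetical illustration of QKV fusion, not the transformers code.
rng = np.random.default_rng(0)
hidden = 8

# Three separate projection weights (shape: out_features x in_features).
Wq, Wk, Wv = (rng.standard_normal((hidden, hidden)) for _ in range(3))

# Fuse them by stacking along the output dimension: one (3*hidden, hidden) matrix.
W_fused = np.concatenate([Wq, Wk, Wv], axis=0)

x = rng.standard_normal((4, hidden))  # a batch of 4 token embeddings

# One matmul instead of three, then split the result back into Q, K, V.
q, k, v = np.split(x @ W_fused.T, 3, axis=-1)

# The fused path is numerically identical to the three separate projections.
assert np.allclose(q, x @ Wq.T)
assert np.allclose(k, x @ Wk.T)
assert np.allclose(v, x @ Wv.T)
```

The benefit is fewer kernel launches and better hardware utilization from one large matmul rather than three small ones; the math is unchanged.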
- Docs: Python not yet supported