transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
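For readers new to the library the issues below refer to, a minimal sketch of its high-level `pipeline` API (a real, documented entry point; the default sentiment checkpoint is downloaded on first use, so network access is assumed):

```python
from transformers import pipeline

# Build a sentiment-analysis pipeline; with no model specified, a default
# checkpoint is fetched from the Hugging Face Hub on first use.
classifier = pipeline("sentiment-analysis")

result = classifier("Transformers makes state-of-the-art NLP easy to use.")
print(result)  # a list of dicts with "label" and "score" keys
```

The same `pipeline` factory covers other tasks (e.g. `"automatic-speech-recognition"`, which one issue below concerns) by changing the task string.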
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're really pro, receive undocumented methods or classes and supercharge your commit history.
Python not yet supported
48 Subscribers
Help out
- Issues
- removed redundant creation of causal mask when attention mask is already 4D
- Fix `inputs` deprecation warning in automatic_speech_recognition.py
- add `private` parameter to trainer.push_to_hub
- Clarify what kwargs can be accepted by "AutoModelForCausalLM.from_pretrained()"
- Support context parallel training with ring-flash-attention
- The examples in the examples directory are mostly for fine-tuning pre-trained models; how to train from scratch?
- Support to use adam_mini from installing directly
- fix flash attention comment
- A Trainer subclass for Decoder-Only LM with generation in evaluate()
- Adding pruning integration to transformer
- Docs: Python not yet supported
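Several of the issue titles above touch `AutoModelForCausalLM.from_pretrained()` and generation. A hedged sketch of how they fit together; `sshleifer/tiny-gpt2` is a small public test checkpoint chosen here only to keep the download light (an assumption, not part of the issues above):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# from_pretrained consumes its own documented kwargs and forwards the rest
# to the model/config constructors; the issue above asks for this to be
# spelled out in the docs.
model = AutoModelForCausalLM.from_pretrained("sshleifer/tiny-gpt2")
tokenizer = AutoTokenizer.from_pretrained("sshleifer/tiny-gpt2")

inputs = tokenizer("Hello", return_tensors="pt")
# generate() appends new tokens after the prompt tokens.
outputs = model.generate(**inputs, max_new_tokens=5)
print(tokenizer.decode(outputs[0]))
```

The Trainer-related issues (e.g. generation inside `evaluate()`) would build on the same `generate()` call shown here.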