transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
- Issues
- Only fine-tune the embeddings of the added special tokens
- Adjust beam search early stopping to any criterion as opposed to all
- Replace all `torch.FloatTensor` with `torch.Tensor`
- Switch from `training_args.bin` to `training_args.json`
- Add embedding scaling
- Don't use the `no_sync` context manager when using gradient accumulation with DeepSpeed's ZeRO stage 2 or 3 via `accelerate`
- Fix bug in image_transforms.py
- How to Log Training Loss at Step Zero in Hugging Face Trainer or SFT Trainer?
- Optimized dola decoding generation function for faster performance
- Docs updates (Russian documentation)
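One of the issues above asks about fine-tuning only the embeddings of added special tokens. A minimal PyTorch sketch of that technique, using a gradient hook to freeze the pre-existing rows of the embedding matrix (all sizes and token ids below are illustrative, not taken from transformers itself):

```python
import torch
import torch.nn as nn

# Pretend the last `num_new` vocabulary ids are newly added special tokens.
vocab_size, hidden, num_new = 10, 4, 2
emb = nn.Embedding(vocab_size, hidden)

# Mask gradients so only the newly added rows receive updates.
grad_mask = torch.zeros(vocab_size, 1)
grad_mask[-num_new:] = 1.0
emb.weight.register_hook(lambda g: g * grad_mask)  # zeroes grads of old rows

before = emb.weight.detach().clone()
opt = torch.optim.SGD(emb.parameters(), lr=0.1)

# One training step that touches both old (0, 1) and new (8, 9) token ids.
loss = emb(torch.tensor([0, 1, 8, 9])).sum()
loss.backward()
opt.step()  # only the rows for ids 8 and 9 move
```

The same idea scales to a real model: clone the pretrained embedding weights, then mask the gradient so rows outside the added-token range never change.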
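The `no_sync` issue above concerns gradient accumulation under distributed training. As background, a plain single-process accumulation loop looks like the sketch below (model and data are illustrative); under DDP, the non-boundary iterations would normally sit inside `model.no_sync()` to skip the per-step gradient all-reduce, which the issue notes should not be done when DeepSpeed ZeRO stage 2 or 3 shards the gradients:

```python
import torch
import torch.nn as nn

# Illustrative model and micro-batches; 8 micro-batches, update every 4.
model = nn.Linear(3, 1)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
accum_steps = 4
batches = [torch.randn(2, 3) for _ in range(8)]

num_updates = 0
opt.zero_grad()
for step, x in enumerate(batches, start=1):
    loss = model(x).pow(2).mean() / accum_steps  # scale to average over micro-batches
    loss.backward()                              # gradients accumulate in .grad
    if step % accum_steps == 0:                  # optimizer step only at the boundary
        opt.step()
        opt.zero_grad()
        num_updates += 1
```

With 8 micro-batches and `accum_steps = 4`, the optimizer steps twice; the division by `accum_steps` keeps the effective gradient an average rather than a sum.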