transformers
https://github.com/huggingface/transformers
Python
🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.
- Issues
- Add ability to specify input device for ffmpeg_microphone()
- Checkpoint saving by different evaluation criteria
- Add basic eval table logging for WandbCallback
- feat: adding mplugdocowl
- Trainer should throw a warning if max_sequence_length < number of tokens in dataset sample record.
- Add Nomic Embed Code to Transformers
- Training GPT2 with run_clm.py exceeds the described memory amount.
- [LLaMA3] 'add_bos_token=True, add_eos_token=True' seems not taking effect
- Add IRIS
- Docs