deepspeed
https://github.com/microsoft/deepspeed
Python
DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
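The library above is driven by a JSON configuration file. As a minimal sketch for context (the keys below come from DeepSpeed's documented config schema; the specific values are illustrative assumptions for a small fp16 + ZeRO stage 2 run, the kind of setup several issues below refer to):

```json
{
  "train_batch_size": 32,
  "gradient_accumulation_steps": 1,
  "fp16": { "enabled": true },
  "zero_optimization": { "stage": 2 }
}
```

A file like this is typically passed to `deepspeed.initialize()` via the launcher's `--deepspeed_config` flag.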
Python not yet supported (9 Subscribers)
Help out
- Issues
- Add HIP device abstraction, update Triton skip logic
- [REQUEST] ZeRO - introduce replicas to keep GBS from getting too large on hundreds of GPUs
- [BUG] Unexpected GPU memory consumption when using transformers PEFT in Zero3
- Confusing memory usage estimations
- [BUG] Error: Preparing DeepSpeed ZeRO stage 2 optimizer HIP error on AMD MI200 GPUs
- A possible solution to resolve the issue of deepspeed.initialize() hanging.
- [BUG] Error: Sizes of tensors must match except in dimension 1. Expected size 64 but got size 512 for tensor number 2 in the list.
- TEST: PR HIP-ifying and running the bias_activations kernel on AMD
- [BUG] Pipeline parallel: with only a few trainable parameters, only GPU0 has parameters with requires_grad=True, causing "ValueError: optimizer got an empty parameter list" in deepspeed.initialize
- [BUG] Loss decreases cyclically, with the epoch as the cycle
- Docs: Python not yet supported