deepspeed
https://github.com/microsoft/deepspeed
Python
DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective.
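For orientation, DeepSpeed training typically revolves around a JSON-style configuration passed to `deepspeed.initialize`. The sketch below builds such a config in plain Python; the keys shown (`train_batch_size`, `fp16`, `zero_optimization`) are standard DeepSpeed config fields, but the values are illustrative only, and the `initialize` call is left commented so the snippet runs without DeepSpeed installed.

```python
# Minimal sketch of a DeepSpeed training configuration (illustrative values).
ds_config = {
    "train_batch_size": 32,
    "fp16": {"enabled": True},           # mixed-precision (FP16) training
    "zero_optimization": {"stage": 1},   # ZeRO optimizer-state partitioning
}

# With DeepSpeed installed, an engine would typically be created like:
# import deepspeed
# engine, optimizer, _, _ = deepspeed.initialize(
#     model=model, model_parameters=model.parameters(), config=ds_config)
```

Several of the issues listed below (FP16 gradient clipping, ZeRO/LAMB compatibility) concern exactly these config sections.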
- Issues
- Support configuration for general devices and backends.
- max_grad_norm is ignored in FP16 training
- ZeRO optimizer LAMB compatibility
- Why does the DeepSpeed transformer only adjust the init range on rank 0?
- 'CUDA error: an illegal memory access was encountered' in forward
- Reproducing usability numbers without model parallelism
- Warning: NaN or Inf found in input tensor when running DeepSpeedExamples/BingBertSquad.
- Using DeepSpeed with the latest version of Megatron-LM
- Following the Bert-finetuning tutorial results in `ImportError` or `IsADirectoryError:` for run_squad_baseline.sh
- How to reproduce BERT perf results in deepspeed blog