text-generation-inference
https://github.com/huggingface/text-generation-inference
Python
Large Language Model Text Generation Inference
Triage Issues!
When you volunteer to triage issues, you'll receive an email each day with a link to an open issue that needs help in this project. You'll also receive instructions on how to triage issues.
Triage Docs!
Receive a documented method or class from your favorite GitHub repos in your inbox every day. If you're a real pro, receive undocumented methods or classes and supercharge your commit history.
Python not yet supported
1 Subscriber
Help out
- Issues
- Can't install on Ubuntu 22.04 with CUDA 11.8
- Response prefill logprobs seem to become incorrect when using `AsyncInferenceClient` in some circumstances
- Add support for Idefics 3
- Error while building TGI from source
- A seeming typo in `text_generation_server/utils/adapters.py`
- Feature Request: Support for LLaMa 3.1 built-in tools
- Watermarking cannot be detected
- Quantization Failure with Bitsandbytes on SageMaker TGI Deployment: Compatibility Issue?
- Failing to unpickle the model
- Could not import SGMV kernel from Punica, falling back to loop.
- Docs