bitsandbytes 0.45.1
k-bit optimizers and matrix multiplication routines.
pip install bitsandbytes
Requires Python: >=3.8
Dependencies
- torch ~=2.0
- numpy >=1.17
- pandas ; extra == "benchmark"
- matplotlib ; extra == "benchmark"
- hf-doc-builder ==0.5.0 ; extra == "docs"
- bitsandbytes[test] ; extra == "dev"
- build >=1.0.0,<2 ; extra == "dev"
- ruff ==0.6.9 ; extra == "dev"
- pre-commit >=3.5.0,<4 ; extra == "dev"
- wheel >=0.42,<1 ; extra == "dev"
- einops ~=0.8.0 ; extra == "test"
- lion-pytorch ==0.2.3 ; extra == "test"
- pytest ~=8.3 ; extra == "test"
- scipy >=1.10.1,<2 ; python_version < "3.9" and extra == "test"
- scipy >=1.11.4,<2 ; python_version >= "3.9" and extra == "test"
- transformers >=4.30.1,<5 ; extra == "test"
bitsandbytes
The bitsandbytes library is a lightweight Python wrapper around CUDA custom functions, in particular 8-bit optimizers, matrix multiplication (LLM.int8()), and 8-bit & 4-bit quantization functions.
The library includes quantization primitives for 8-bit & 4-bit operations through bitsandbytes.nn.Linear8bitLt and bitsandbytes.nn.Linear4bit, and 8-bit optimizers through the bitsandbytes.optim module.
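To give a rough feel for the kind of primitive these classes build on, here is a minimal sketch of absmax int8 quantization in plain Python. The function names are invented for illustration; the library's actual kernels are implemented in CUDA and exposed through the classes named above.

```python
def quantize_absmax(xs):
    # Absmax quantization: divide by the largest magnitude so values
    # map into the signed 8-bit range [-127, 127].
    absmax = max(abs(x) for x in xs)
    scale = absmax / 127.0 if absmax else 1.0
    return [round(x / scale) for x in xs], scale

def dequantize(qs, scale):
    # Recover approximate floats from the int8 codes.
    return [q * scale for q in qs]

weights = [0.5, -1.0, 0.25]
codes, scale = quantize_absmax(weights)
# dequantize(codes, scale) approximates the original weights,
# with error bounded by half a quantization step.
```

The key design point, shared by the real 8-bit routines, is that only the int8 codes plus one float scale per block need to be stored, cutting memory roughly 4x versus float32.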
There are ongoing efforts to support additional hardware backends, namely Intel CPU + GPU, AMD GPU, and Apple Silicon. Windows support is also well underway.
Please head to the official documentation page: https://huggingface.co/docs/bitsandbytes/main
bitsandbytes multi-backend alpha release is out!
🚀 Big news! After months of hard work and incredible community contributions, we're thrilled to announce the bitsandbytes multi-backend alpha release! 💥
Now supporting:
- 🔥 AMD GPUs (ROCm)
- ⚡ Intel CPUs & GPUs
We’d love your early feedback! 🙏
👉 Instructions for your pip install here
We're super excited about these recent developments and grateful for any constructive input or support that you can give to help us make this a reality (e.g. helping us with the upcoming Apple Silicon backend or reporting bugs). BNB is a community project and we're excited for your collaboration 🤗
License
bitsandbytes is MIT licensed.
We thank Fabio Cannizzo for his work on FastBinarySearch which we use for CPU quantization.