Releases · JamePeng/llama-cpp-python
v0.3.14-cu128-AVX2-win-20250801
- Compile with CUDA 12.8.1 for Blackwell-architecture compute cards (sm_100 and sm_120)
- Sync llama.cpp API (20250801)
- Remove sm_70 from the CUDA 12.8.1 Actions workflow
v0.3.14-cu128-AVX2-linux-20250801
- Compile with CUDA 12.8.1 for Blackwell-architecture compute cards (sm_100 and sm_120)
- Sync llama.cpp API (20250801)
- Remove sm_70 from the CUDA 12.8.1 Actions workflow
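The cu128 wheels above are built for Blackwell GPUs (sm_100/sm_120). A quick smoke test for GPU offload via the high-level API, as a minimal sketch (model.gguf is a placeholder path, not an asset from these releases):

```python
from llama_cpp import Llama

# n_gpu_layers=-1 asks llama.cpp to offload every layer to the GPU; with a
# CUDA wheel, verbose=True prints the backend/device that was picked up.
llm = Llama(
    model_path="model.gguf",  # placeholder: any local GGUF model
    n_gpu_layers=-1,
    verbose=True,
)

out = llm("Q: Name the planets in the solar system. A:", max_tokens=32)
print(out["choices"][0]["text"])
```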
v0.3.14-cu126-AVX2-win-20250801
- Sync llama.cpp API (20250801)
v0.3.14-cu126-AVX2-linux-20250801
- Sync llama.cpp API (20250801)
v0.3.14-cu124-AVX2-win-20250801
- Sync llama.cpp API (20250801)
v0.3.14-cu124-AVX2-linux-20250801
- Sync llama.cpp API (20250801)
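The cu126 and cu124 builds differ from the cu128 ones only in the bundled CUDA toolkit; all 20250801 wheels carry the same llama.cpp API snapshot. One way to check what an installed wheel reports at runtime, as a hedged sketch (assuming this fork keeps upstream llama-cpp-python's module layout):

```python
import llama_cpp

# Version of the installed binding package.
print(llama_cpp.__version__)

# True when the bundled llama.cpp build supports GPU offload (CUDA here).
print(llama_cpp.llama_supports_gpu_offload())

# Compile-time feature flags of the bundled llama.cpp (AVX2, CUDA, etc.).
print(llama_cpp.llama_print_system_info().decode())
```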
v0.3.13-cu126-AVX2-win-20250717
- Fix a crash bug in memory_seq_rm
v0.3.13-cu126-AVX2-linux-20250717
- Fix a crash bug in memory_seq_rm
v0.3.13-cu124-AVX2-win-20250717
- Fix a crash bug in memory_seq_rm
v0.3.13-cu124-AVX2-linux-20250717
- Fix a crash bug in memory_seq_rm
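memory_seq_rm is the sequence-removal entry point of llama.cpp's memory (KV cache) API. A hedged sketch of the call the fix concerns, via the low-level bindings (assuming this fork exposes llama.cpp's current llama_get_memory/llama_memory_seq_rm names; model.gguf is a placeholder):

```python
import llama_cpp

llama_cpp.llama_backend_init()

model = llama_cpp.llama_model_load_from_file(
    b"model.gguf",  # placeholder: any local GGUF model
    llama_cpp.llama_model_default_params(),
)
ctx = llama_cpp.llama_init_from_model(
    model, llama_cpp.llama_context_default_params()
)

# llama_memory_seq_rm(mem, seq_id, p0, p1) drops the tokens of a sequence
# with positions in [p0, p1) from the context's memory; negative arguments
# act as wildcards (seq_id < 0: every sequence, p0/p1 < 0: whole range).
mem = llama_cpp.llama_get_memory(ctx)
llama_cpp.llama_memory_seq_rm(mem, -1, -1, -1)

llama_cpp.llama_free(ctx)
llama_cpp.llama_model_free(model)
```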