tqcq / llama.cpp
Mirror of https://github.com/ggml-org/llama.cpp.git, synced 2025-07-16 15:47:35 +00:00
llama.cpp / requirements at commit f7cab35ef9e66c57b4a416d09eb6c814e0ba4e4c
Latest commit: d39130a398 by compilade, "py : use cpu-only torch in requirements.txt (#8335)", 2024-07-07 14:23:38 +03:00
File                                          Last commit                                            Date
requirements-convert_hf_to_gguf_update.txt    py : use cpu-only torch in requirements.txt (#8335)   2024-07-07 14:23:38 +03:00
requirements-convert_hf_to_gguf.txt           py : use cpu-only torch in requirements.txt (#8335)   2024-07-07 14:23:38 +03:00
requirements-convert_legacy_llama.txt         py : switch to snake_case (#8305)                      2024-07-05 07:53:33 +03:00
requirements-convert_llama_ggml_to_gguf.txt   py : switch to snake_case (#8305)                      2024-07-05 07:53:33 +03:00
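The two files updated by #8335 pin a CPU-only torch build so the conversion scripts do not pull in CUDA wheels. Below is a minimal sketch of how such a pin is commonly written in a pip requirements file, assuming PyTorch's public CPU wheel index; the version bound is illustrative and not taken from the repository files listed above.

    # Resolve torch against the CPU-only wheel index in addition to PyPI.
    --extra-index-url https://download.pytorch.org/whl/cpu
    # Illustrative version bound; the actual pin in the repository may differ.
    torch~=2.2.0

Each requirements-convert_*.txt file carries the dependencies of the matching conversion script in the repository root (for example, requirements-convert_hf_to_gguf.txt backs convert_hf_to_gguf.py), so a single script's dependencies can be installed on their own with pip install -r.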