tqcq/llama.cpp
Mirror of https://github.com/ggml-org/llama.cpp.git, last synced 2025-07-22 10:48:12 +00:00.
llama.cpp/requirements at commit c61285e7396c8e526fe7794c19e8d4f1c99bfc51
Latest commit: 2b131621e6 by Sigbjørn Skjæret (2025-05-29 15:36:05 +02:00): gguf-py : add support for sub_type (in arrays) in GGUFWriter add_key_value method (#13561)
File | Last commit | Last updated
requirements-all.txt | …
requirements-compare-llama-bench.txt | …
requirements-convert_hf_to_gguf_update.txt | common: Include torch package for s390x (#13699) | 2025-05-22 21:31:29 +03:00
requirements-convert_hf_to_gguf.txt | common: Include torch package for s390x (#13699) | 2025-05-22 21:31:29 +03:00
requirements-convert_legacy_llama.txt | …
requirements-convert_llama_ggml_to_gguf.txt | …
requirements-convert_lora_to_gguf.txt | common: Include torch package for s390x (#13699) | 2025-05-22 21:31:29 +03:00
requirements-gguf_editor_gui.txt | gguf-py : add support for sub_type (in arrays) in GGUFWriter add_key_value method (#13561) | 2025-05-29 15:36:05 +02:00
requirements-pydantic.txt | …
requirements-test-tokenizer-random.txt | …
requirements-tool_bench.txt | …