tqcq/llama.cpp, a mirror of https://github.com/ggml-org/llama.cpp.git (synced 2025-06-30 12:55:17 +00:00)
Directory: llama.cpp/.devops/nix at commit 213701b51a17175d0d326b566efc03f30ec7fbe6
Latest commit: 3840b6f593 "nix : enable curl (#8043)" by Michael Francis, co-authored by Georgi Gerganov <ggerganov@gmail.com>, 2024-07-01 14:47:04 +03:00
apps.nix              | build: rename main → llama-cli, server → llama-server, llava-cli → llama-llava-cli, etc... (#7809) | 2024-06-13 00:41:52 +01:00
devshells.nix         | flake.nix : rewrite (#4605) | 2023-12-29 16:42:26 +02:00
docker.nix            | nix: init singularity and docker images (#5056) | 2024-02-22 11:44:10 -08:00
jetson-support.nix    | flake.nix: expose full scope in legacyPackages | 2023-12-31 13:14:58 -08:00
nixpkgs-instances.nix | nix: add a comment on the many nixpkgs-with-cuda instances | 2024-01-22 12:19:30 +00:00
package.nix           | nix : enable curl (#8043) | 2024-07-01 14:47:04 +03:00
scope.nix             | nix: init singularity and docker images (#5056) | 2024-02-22 11:44:10 -08:00
sif.nix               | build(nix): Introduce flake.formatter for nix fmt (#5687) | 2024-03-01 15:18:26 -08:00
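
For orientation, below is a minimal sketch of how a flake backed by this kind of Nix packaging layout can be consumed from another flake. It assumes the upstream flake exposes a conventional packages.<system>.default output; the consumer flake, that attribute path, and the chosen system are illustrative assumptions, not contents of the files listed above.

    # Hypothetical consumer flake.nix; assumes the upstream llama.cpp flake
    # (whose .devops/nix directory provides package.nix, scope.nix, etc.)
    # exposes a standard packages.<system>.default output.
    {
      inputs.llama-cpp.url = "github:ggml-org/llama.cpp";

      outputs = { self, llama-cpp, ... }: {
        # Re-export the upstream default package for one system (assumed attribute name).
        packages.x86_64-linux.default = llama-cpp.packages.x86_64-linux.default;
      };
    }

A nix build of either flake then produces the package; judging by their commit messages, the remaining files in the listing (devshells.nix, docker.nix, sif.nix) presumably feed the dev-shell and Docker/Singularity image outputs.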