tqcq / llama.cpp
Mirror of https://github.com/ggml-org/llama.cpp.git, synced 2025-09-07 15:40:19 -04:00
llama.cpp / .github / workflows (at commit 213d1439fadefe182f69c5f7e8dd3b4b6572ebcb)
Latest commit: 1cfb5372cf by Eve, 2024-01-31 20:21:55 +01:00
Fix broken Vulkan Cmake (properly) (#5230)
* build vulkan as object
* vulkan ci
File                           Last commit                                                      Updated
..
build.yml                      Fix broken Vulkan Cmake (properly) (#5230)                       2024-01-31 20:21:55 +01:00
code-coverage.yml              …
docker.yml                     docker : add server-first container images (#5157)              2024-01-28 09:55:31 +02:00
editorconfig.yml               support SYCL backend windows build (#5208)                       2024-01-31 08:08:07 +05:30
gguf-publish.yml               …
nix-ci-aarch64.yml             workflows: nix-build-aarch64: rate limit                         2024-01-22 12:19:30 +00:00
nix-ci.yml                     workflows: nix-ci: drop the redundant "paths" filter            2024-01-22 12:19:30 +00:00
nix-flake-update.yml           ci: nix-flake-update: new token with pr permissions (#4879)     2024-01-11 17:22:34 +00:00
nix-publish-flake.yml          workflows: nix-flakestry: drop tag filters                       2023-12-31 13:14:58 -08:00
python-check-requirements.yml  python : add check-requirements.sh and GitHub workflow (#4585)  2023-12-29 16:50:29 +02:00
python-lint.yml                …
tidy-post.yml                  …
tidy-review.yml                …
zig-build.yml                  …