tqcq/llama.cpp
mirror of https://github.com/ggml-org/llama.cpp.git synced 2025-08-20 06:36:48 -04:00
llama.cpp/.github/workflows (tree 5eaf9964fc797d4585c214db32a463d557f3ed33)
Latest commit: 2bed4aa3f3 by Xuan Son Nguyen, 2024-01-23 09:11:39 +02:00
devops : add intel oneapi dockerfile (#5068)
Co-authored-by: Xuan Son Nguyen <xuanson.nguyen@snowpack.eu>
build.yml                        ci : fix Windows CI by updating Intel SDE version (#5053)        2024-01-22 10:55:05 +02:00
code-coverage.yml                …
docker.yml                       devops : add intel oneapi dockerfile (#5068)                     2024-01-23 09:11:39 +02:00
editorconfig.yml                 …
gguf-publish.yml                 …
nix-ci-aarch64.yml               workflows: nix-build-aarch64: rate limit                         2024-01-22 12:19:30 +00:00
nix-ci.yml                       workflows: nix-ci: drop the redundant "paths" filter             2024-01-22 12:19:30 +00:00
nix-flake-update.yml             ci: nix-flake-update: new token with pr permissions (#4879)      2024-01-11 17:22:34 +00:00
nix-publish-flake.yml            workflows: nix-flakestry: drop tag filters                       2023-12-31 13:14:58 -08:00
python-check-requirements.yml    python : add check-requirements.sh and GitHub workflow (#4585)   2023-12-29 16:50:29 +02:00
python-lint.yml                  ci : add flake8 to github actions (python linting) (#4129)       2023-11-20 11:35:47 +01:00
tidy-post.yml                    …
tidy-review.yml                  …
zig-build.yml                    ci : add Zig CI/CD and fix build (#2996)                         2023-10-08 16:59:20 +03:00