tqcq / llama.cpp
Mirror of https://github.com/ggml-org/llama.cpp.git, synced 2025-08-01 15:09:32 -04:00
Tag b1350 — llama.cpp/prompts
Latest commit: Georgi Gerganov, 0c731ca403 — prompts : fix editorconfig checks after #3416 (2023-10-06 16:36:32 +03:00)
File                     Last commit                                               Date
alpaca.txt               …
chat-with-baichuan.txt   feature : support Baichuan serial models (#3009)         2023-09-14 12:32:10 -04:00
chat-with-bob.txt        …
chat-with-vicuna-v0.txt  examples : read chat prompts from a template file (#1196) 2023-05-03 20:58:11 +03:00
chat-with-vicuna-v1.txt  examples : read chat prompts from a template file (#1196) 2023-05-03 20:58:11 +03:00
chat.txt                 examples : read chat prompts from a template file (#1196) 2023-05-03 20:58:11 +03:00
dan-modified.txt         prompts : model agnostic DAN (#1304)                      2023-05-11 18:10:19 +03:00
dan.txt                  prompts : model agnostic DAN (#1304)                      2023-05-11 18:10:19 +03:00
LLM-questions.txt        parallel : add option to load external prompt file (#3416) 2023-10-06 16:16:38 +03:00
parallel-questions.txt   prompts : fix editorconfig checks after #3416             2023-10-06 16:36:32 +03:00
reason-act.txt           …