tqcq / llama.cpp
mirror of https://github.com/ggml-org/llama.cpp.git
synced 2025-08-15 20:53:00 -04:00
llama.cpp / prompts
at commit b0ec5218c3d24755786b80ecce9cf4ffc07583f8

Latest commit: 0c731ca403 by Georgi Gerganov, "prompts : fix editorconfig checks after #3416", 2023-10-06 16:36:32 +03:00
..
alpaca.txt               …
chat-with-baichuan.txt   feature : support Baichuan serial models (#3009)             2023-09-14 12:32:10 -04:00
chat-with-bob.txt        …
chat-with-vicuna-v0.txt  …
chat-with-vicuna-v1.txt  …
chat.txt                 …
dan-modified.txt         prompts : model agnostic DAN (#1304)                         2023-05-11 18:10:19 +03:00
dan.txt                  prompts : model agnostic DAN (#1304)                         2023-05-11 18:10:19 +03:00
LLM-questions.txt        parallel : add option to load external prompt file (#3416)   2023-10-06 16:16:38 +03:00
parallel-questions.txt   prompts : fix editorconfig checks after #3416                2023-10-06 16:36:32 +03:00
reason-act.txt           …