tqcq / llama.cpp
Mirror of https://github.com/ggml-org/llama.cpp.git (synced 2025-09-03 05:39:25 -04:00)
Commit bdca38376f7e8dd928defe01ce6a16218a64b040
llama.cpp / prompts
alpaca.txt                …
assistant.txt             …
chat-with-baichuan.txt    …
chat-with-bob.txt         …
chat-with-qwen.txt        llama : add Qwen support (#4281)                             2023-12-01 20:16:31 +02:00
chat-with-vicuna-v0.txt   …
chat-with-vicuna-v1.txt   …
chat.txt                  examples : read chat prompts from a template file (#1196)    2023-05-03 20:58:11 +03:00
dan-modified.txt          …
dan.txt                   prompts : model agnostic DAN (#1304)                         2023-05-11 18:10:19 +03:00
LLM-questions.txt         …
mnemonics.txt             …
parallel-questions.txt    …
reason-act.txt            do not force the prompt file to end with a new line (#908)   2023-04-13 11:33:16 +02:00