llama.cpp/examples
Latest commit: d894f352bf by slaren, 2024-03-09 19:55:54 +01:00
perplexity : support using multiple sequences to allow larger batch sizes (#5946)

Commit message:

* perplexity : support using multiple sequences to allow larger batch sizes

ggml-ci

* set cparams.n_parallel to the number of sequences

* print tested n_ctx, add assert
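
The idea behind the change is to pack several evaluation chunks into one llama_batch under distinct sequence ids, so a single llama_decode call processes a larger batch. Below is a minimal sketch of that pattern using the public llama.h batch API; the function name decode_chunks and the chunk layout are illustrative assumptions, not the actual perplexity.cpp code, and model/context setup is omitted.

#include "llama.h"

#include <vector>

// Evaluate several token chunks in one llama_decode call by giving each
// chunk its own sequence id. Sketch only: the real perplexity example also
// gathers logits per chunk and handles batches that do not divide evenly.
static int decode_chunks(llama_context * ctx, const std::vector<std::vector<llama_token>> & chunks) {
    int32_t n_tokens = 0;
    for (const auto & chunk : chunks) {
        n_tokens += (int32_t) chunk.size();
    }

    // one batch large enough for all chunks, with up to chunks.size() sequences
    llama_batch batch = llama_batch_init(n_tokens, 0, (int32_t) chunks.size());

    for (size_t s = 0; s < chunks.size(); ++s) {
        for (size_t i = 0; i < chunks[s].size(); ++i) {
            const int idx = batch.n_tokens;

            batch.token   [idx]    = chunks[s][i];
            batch.pos     [idx]    = (llama_pos) i;       // positions restart for each sequence
            batch.n_seq_id[idx]    = 1;
            batch.seq_id  [idx][0] = (llama_seq_id) s;    // each chunk is its own sequence
            batch.logits  [idx]    = true;                // perplexity needs logits for every token

            batch.n_tokens++;
        }
    }

    const int ret = llama_decode(ctx, batch);  // single decode call for all sequences

    llama_batch_free(batch);
    return ret;
}

For this to work, the context has to be created with enough parallel sequences (the commit's "set cparams.n_parallel to the number of sequences") and a context size that covers all chunks together.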