Compute perplexity over prompt (#270)

* Compute perplexity over prompt

* More accurate perplexity calculation, computed over all logits in the context window (so 512x more tokens scored!); see the sketch after this list

* Output all perplexities

* Add timing/ETA
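
The calculation the bullets describe can be sketched roughly as follows. This is an illustrative outline, not the commit's code: it assumes the model's raw scores for the prompt are available as a flat `logits` array of `n_tokens x n_vocab` floats, takes a numerically stable log-softmax at each position, averages the negative log-likelihood of each true next token, and exponentiates; the function names, the ETA estimate, and the progress printout are all hypothetical.

```cpp
// Sketch only: perplexity of a tokenized prompt from per-position logits.
#include <algorithm>
#include <chrono>
#include <cmath>
#include <cstdio>
#include <vector>

// log softmax of entry `target` in one row of raw scores (numerically stable)
static double log_softmax_at(const float * row, int n_vocab, int target) {
    float max_logit = row[0];
    for (int i = 1; i < n_vocab; i++) {
        max_logit = std::max(max_logit, row[i]);
    }
    double sum_exp = 0.0;
    for (int i = 0; i < n_vocab; i++) {
        sum_exp += std::exp(row[i] - max_logit);
    }
    return (row[target] - max_logit) - std::log(sum_exp);
}

// logits: n_tokens rows of n_vocab floats; token i+1 is scored against row i
double perplexity_over_prompt(const std::vector<float> & logits,
                              const std::vector<int>   & tokens,
                              int n_vocab) {
    const int n_tokens = (int) tokens.size();
    if (n_tokens < 2) {
        return 0.0; // nothing to score
    }
    const auto t_start = std::chrono::steady_clock::now();
    double nll = 0.0;
    for (int i = 0; i + 1 < n_tokens; i++) {
        // negative log-likelihood of the true next token under the model
        nll -= log_softmax_at(logits.data() + (size_t) i * n_vocab, n_vocab, tokens[i + 1]);
        // running perplexity plus a rough ETA, in the spirit of the timing/ETA bullet
        const double elapsed = std::chrono::duration<double>(
                std::chrono::steady_clock::now() - t_start).count();
        const double eta = elapsed / (i + 1) * (n_tokens - 2 - i);
        std::printf("[%d/%d] ppl = %.4f, ETA %.0f s\r", i + 1, n_tokens - 1,
                    std::exp(nll / (i + 1)), eta);
    }
    std::printf("\n");
    return std::exp(nll / (n_tokens - 1)); // perplexity = exp(mean negative log-likelihood)
}
```

Perplexity is the exponential of the mean negative log-likelihood, so lower is better; scoring every position of the window rather than a single position per evaluation is what gives the "512x more tokens" mentioned above.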
Author: Gary Linscott
Date: 2023-03-21 09:27:42 -07:00
Committed by: GitHub
Parent: 3ab3e6582f
Commit: 486ae645fd
3 changed files with 98 additions and 13 deletions


@@ -40,6 +40,7 @@ struct gpt_params {
     bool interactive_start = false; // reverse prompt immediately
     bool instruct = false; // instruction mode (used for Alpaca models)
     bool ignore_eos = false; // do not stop generating after eos
+    bool perplexity = false; // compute perplexity over the prompt
 };
 
 bool gpt_params_parse(int argc, char ** argv, gpt_params & params);
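
As a small, hypothetical illustration of how the new boolean could be surfaced on the command line and used to gate the scoring path, here is a self-contained sketch; the flag string `--perplexity`, the struct name `params_sketch`, and the printout are assumptions for the sketch, not taken from this commit.

```cpp
#include <cstdio>
#include <cstring>

// Stand-in for the real struct in the diff above; only the new field is shown.
struct params_sketch {
    bool perplexity = false; // compute perplexity over the prompt
};

int main(int argc, char ** argv) {
    params_sketch params;
    for (int i = 1; i < argc; i++) {
        // the exact flag string is an assumption for this sketch
        if (std::strcmp(argv[i], "--perplexity") == 0) {
            params.perplexity = true;
        }
    }
    if (params.perplexity) {
        std::printf("would score the prompt (exp of mean NLL) instead of generating\n");
    }
    return 0;
}
```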