intuition - What is perplexity? - Cross Validated I came across the term perplexity, which refers to the log-averaged inverse probability on unseen data. The Wikipedia article on perplexity does not give an intuitive meaning for it. This perplexity ...
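For intuition, that "log-averaged inverse probability" can be written out directly: perplexity is the exponential of the average negative log-probability the model assigns to the held-out tokens, which is the same as the geometric mean of the inverse token probabilities. A minimal sketch with made-up per-token probabilities (not from the question itself):

```python
import math

# Hypothetical probabilities a model assigned to four unseen tokens.
token_probs = [0.1, 0.25, 0.05, 0.2]

# Perplexity = exp of the average negative log-probability.
avg_neg_logprob = -sum(math.log(p) for p in token_probs) / len(token_probs)
perplexity = math.exp(avg_neg_logprob)

# Equivalent "inverse probability" form: geometric mean of 1/p.
inv_geo_mean = math.prod(1.0 / p for p in token_probs) ** (1.0 / len(token_probs))

print(perplexity, inv_geo_mean)  # both ≈ 7.95
```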
Measuring perplexity over a limited domain in an LLM Measuring perplexity over multiple tokens isn't intrinsically hard: compare the probabilities of the two strings. More interesting is this "limited domain": do you have a finite set of movies a priori?
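A sketch of what "compare the probabilities of the two strings" could look like under a causal LLM, assuming the Hugging Face transformers library and GPT-2 purely as an illustrative choice (neither is named in the question): sum each string's token log-probabilities, or exponentiate the per-token average to get a perplexity.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Illustrative model choice; any causal LM would do.
tok = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

def score(text: str):
    """Return (total log-probability, perplexity) of `text` under the model."""
    ids = tok(text, return_tensors="pt").input_ids
    with torch.no_grad():
        out = model(ids, labels=ids)
    n_predicted = ids.shape[1] - 1                    # first token has no context
    total_logprob = -out.loss.item() * n_predicted    # loss is mean NLL per token
    perplexity = torch.exp(out.loss).item()
    return total_logprob, perplexity

# Compare two candidate strings, e.g. two movie titles.
for s in ["The Godfather", "The Godmother"]:
    print(s, score(s))
```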
How to calculate the perplexity of test data versus language models That part I have done (for reference: the models I implemented were a bigram letter model, a Laplace smoothing model, a Good-Turing smoothing model, and a Katz back-off model). Now I am tasked with finding the perplexity of the test data (the sentences for which I am predicting the language) against each language model.
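For this setup, the perplexity against each model is the same formula as above, with the model's (smoothed) conditional probabilities plugged in. A minimal sketch, assuming a bigram letter model with Laplace (add-one) smoothing; the training sentences, test sentences, and function names are placeholders for illustration:

```python
import math
from collections import Counter

def train_bigram_laplace(corpus):
    """Train a character bigram model with add-one smoothing."""
    bigrams, contexts, vocab = Counter(), Counter(), set()
    for sent in corpus:
        chars = ["<s>"] + list(sent)
        vocab.update(chars)
        contexts.update(chars[:-1])
        bigrams.update(zip(chars[:-1], chars[1:]))
    V = len(vocab)
    def prob(prev, cur):
        return (bigrams[(prev, cur)] + 1) / (contexts[prev] + V)
    return prob

def perplexity(prob, sentences):
    """Perplexity of the test sentences under a conditional model `prob`."""
    log_sum, n_tokens = 0.0, 0
    for sent in sentences:
        chars = ["<s>"] + list(sent)
        for prev, cur in zip(chars[:-1], chars[1:]):
            log_sum += math.log(prob(prev, cur))
            n_tokens += 1
    return math.exp(-log_sum / n_tokens)

# Toy example: training and test data are made up.
model = train_bigram_laplace(["hello world", "help me"])
print(perplexity(model, ["hell world", "held"]))
```

The same `perplexity` function can then be reused for each of the other models by swapping in that model's conditional probability function.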