Text Generation: Decoding (easy)
A language model generates text by greedy decoding: at each step it selects the single token with the highest probability. The output reads: "The best way to learn programming is to practice. practice. practice. practice." What decoding property causes this repetition, and what does greedy decoding optimize for?
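The failure mode can be seen in a minimal sketch of the greedy loop. The next-token model below is a hypothetical lookup table (all names and probabilities are invented for illustration); the point is that once the argmax after "practice." is "practice." itself, per-step argmax selection can never escape the cycle, because greedy decoding optimizes the probability of each token locally rather than the likelihood of the whole sequence.

```python
# Minimal sketch of greedy decoding over a toy, hypothetical
# next-token distribution (not a real language model).

def toy_next_token_probs(tokens):
    """Return an invented next-token distribution given the last token."""
    table = {
        "is": {"to": 0.9, "practice.": 0.1},
        "to": {"practice.": 0.8, "read.": 0.2},
        # "practice." is the argmax after itself, so greedy loops forever.
        "practice.": {"practice.": 0.6, "<eos>": 0.4},
    }
    return table.get(tokens[-1], {"<eos>": 1.0})

def greedy_decode(prompt, max_steps=6):
    tokens = prompt.split()
    for _ in range(max_steps):
        probs = toy_next_token_probs(tokens)
        next_tok = max(probs, key=probs.get)  # greedy: argmax at each step
        if next_tok == "<eos>":
            break
        tokens.append(next_tok)
    return " ".join(tokens)

print(greedy_decode("The best way to learn programming is"))
```

Sampling-based decoders (temperature, top-k, nucleus) break this cycle by occasionally picking the lower-probability token, here "<eos>" at 0.4, trading per-step optimality for more varied sequences.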