LLM Fundamentals quiz
What does setting temperature=0 do when calling an LLM API?
It makes the model deterministic and focused, always picking the highest-probability token
It disables the model's output entirely
It makes the model maximally creative and random
It reduces the output to exactly one token
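Most LLM APIs implement `temperature=0` as greedy decoding: instead of dividing the logits by the temperature before the softmax (which would divide by zero), the sampler simply takes the argmax. A minimal sketch of that behavior, assuming a plain list of logits (the function name and example values are illustrative, not any particular provider's API):

```python
import math
import random

def sample_token(logits, temperature):
    """Pick a token index from raw logits at the given temperature."""
    if temperature == 0:
        # Greedy decoding: deterministically pick the highest-probability token.
        return max(range(len(logits)), key=lambda i: logits[i])
    # Otherwise scale logits by 1/temperature, then softmax and sample.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = random.random()
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1

logits = [2.0, 1.0, 0.5]
print(sample_token(logits, 0))  # always 0, the index of the highest logit
```

With `temperature=0` the call is fully deterministic; higher temperatures flatten the distribution, making lower-probability tokens more likely to be sampled.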