Context, then answer, instead of having everything ride on the first character. E.g. if we make it pick “Y” or “N” first in response to a yes-or-no question, it usually picks “Y” even if it later talks itself out of it.
That’s the basis of reasoning models. Make LLMs ‘think’ through the problem for several hundred tokens before giving a final answer.
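A minimal sketch of the two prompting styles being contrasted; the question and the exact wording are illustrative, not any model’s required format:

```python
# Answer-first: the verdict is the very first sampled token, so the
# model commits before doing any visible work.
answer_first = "Is 1013 prime? Answer with a single character, Y or N."

# Reason-first: the model emits its working tokens before the verdict,
# so the final "Answer:" line is conditioned on all of them.
reason_first = (
    "Is 1013 prime? Think through the problem step by step, "
    "then finish with a line of the form 'Answer: Y' or 'Answer: N'."
)

print(answer_first)
print(reason_first)
```

Reasoning models bake the second style into training, so the “think for a few hundred tokens first” part happens without the user asking for it.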

Or even better, don’t use the racist pile of linear algebra that regurgitates misinformation and propaganda.
Crank the temperature settings and have it say “Trust me, bro.”
More wise it would sound.
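For what the temperature knob actually does: sampling divides the logits by T before the softmax, so a high temperature flattens the distribution toward uniform. A self-contained toy with made-up logit values:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Divide logits by T, then softmax; higher T flattens the distribution."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [3.0, 1.0, 0.2]  # toy next-token scores, not from a real model

low = softmax_with_temperature(logits, 0.5)    # sharpens toward the top token
high = softmax_with_temperature(logits, 10.0)  # approaches uniform

print(max(low), max(high))
```

At T=0.5 the top token takes nearly all the mass; at T=10 the three options are close to 1/3 each, which is roughly the “Trust me, bro” regime.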


