We need a way to cancel the current answer generation.
This is useful when the LLM is generating a long answer and the user notices that a crucial detail is missing from the prompt, or when a parameter such as the temperature was accidentally set too high and the answer degenerates into a long string of gibberish.
One idea: while the LLM is generating an answer, the send button turns into a stop button.
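A minimal sketch of that send/stop toggle, assuming the generation API accepts an `AbortSignal`. All names here (`streamAnswer`, `onSendOrStop`) are hypothetical, and the generator only simulates token streaming; a real client would pass the signal to its fetch/SDK call instead:

```typescript
// Hypothetical token stream that respects cancellation via AbortSignal.
async function* streamAnswer(signal: AbortSignal): AsyncGenerator<string> {
  const tokens = ["Hello", " ", "world", "…"]; // stand-in for LLM output
  for (const tok of tokens) {
    if (signal.aborted) return; // stop emitting once the user cancelled
    yield tok;
    await new Promise((r) => setTimeout(r, 10)); // simulate token latency
  }
}

// Non-null while a generation is in flight; this doubles as the UI state
// that decides whether the button shows "send" or "stop".
let controller: AbortController | null = null;

async function onSendOrStop(): Promise<string> {
  if (controller) {
    // A generation is running → the click means "stop".
    controller.abort();
    controller = null;
    return "(cancelled)";
  }
  // Idle → the click means "send": start streaming the answer.
  controller = new AbortController();
  let answer = "";
  for await (const tok of streamAnswer(controller.signal)) {
    answer += tok; // append tokens to the UI as they arrive
  }
  controller = null; // generation finished; button reverts to "send"
  return answer;
}
```

The same `AbortSignal` can be handed to `fetch` directly, so aborting also tears down the underlying HTTP request rather than merely ignoring further tokens.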