When using reasoning models like DeepSeek, it takes a very long time to get a response, so streaming the message back as it is generated would be very nice.
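
For illustration, here is a minimal sketch of what streaming could look like, assuming an OpenAI-compatible client pointed at DeepSeek's API; the base URL, model name, and delta fields are assumptions for this example, not a confirmed implementation for this project:

```python
# Minimal sketch: print tokens as they arrive instead of waiting for the full reply.
# Assumes DeepSeek's OpenAI-compatible endpoint; base_url, model name, and the
# exact delta fields are assumptions for this illustration.
from openai import OpenAI

client = OpenAI(api_key="YOUR_API_KEY", base_url="https://api.deepseek.com")

stream = client.chat.completions.create(
    model="deepseek-reasoner",
    messages=[{"role": "user", "content": "Explain streaming in one paragraph."}],
    stream=True,  # ask the API to send chunks as they are generated
)

for chunk in stream:
    delta = chunk.choices[0].delta
    # Reasoning models may emit thinking tokens separately from the final answer.
    if getattr(delta, "reasoning_content", None):
        print(delta.reasoning_content, end="", flush=True)
    if delta.content:
        print(delta.content, end="", flush=True)
```

Even just surfacing the reasoning tokens as they stream would make the long wait feel much shorter.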