Model returns garbage response (reasoning tokens exposed)

I am still getting the following garbage response from the model:
```
Your ​ ​ ​ … … … … … … … … … … … … … … … … … … … … … … … … … … … …

The response appears garbled. Need to output proper final spoken reply: “Your report has been sent successfully. Is there anything else I can help you with?” Probably.Your report has been sent successfully. Is there anything else I can help you with?
```

This happens even though I have configured the ChatGroq client as follows:

```python
llm = ChatGroq(
    model=settings.GROQ_LLM_MODEL,
    api_key=settings.GROQ_API_KEY,
    temperature=0,
    max_tokens=800,
    max_retries=2,
    reasoning_format="hidden",
)
```

I need to know why the reasoning tokens are still being exposed in the response even with `reasoning_format="hidden"` set.
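In the meantime, I am stripping any leaked reasoning on the client side before using the reply. This is only a sketch of a workaround, not a fix: it assumes the model wraps its chain-of-thought in `<think>...</think>` tags (the convention several Groq reasoning models use, and what `reasoning_format="hidden"` is supposed to remove); if the reasoning leaks in a different shape, as in the garbled output above, the pattern would need adjusting.

```python
import re

def strip_reasoning(text: str) -> str:
    """Remove a leaked <think>...</think> reasoning block from a model reply.

    Assumes the model emits its chain-of-thought inside <think> tags; adjust
    the pattern if your model leaks reasoning in a different format.
    """
    cleaned = re.sub(r"<think>.*?</think>", "", text, flags=re.DOTALL)
    # Collapse any whitespace left behind where the block was removed.
    return cleaned.strip()

# Hypothetical example of a leaked reply:
reply = "<think>Need to output proper final spoken reply.</think>Your report has been sent successfully."
print(strip_reasoning(reply))
# → Your report has been sent successfully.
```

This obviously treats the symptom rather than the cause, so I would still like to understand why `reasoning_format="hidden"` is not doing this server-side.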

@meshari @Nikhil_Kaushal @yawnxyz @benank