To be blunt, it happens sometimes, but not as often as you might think.
It is in the nature of current LLMs to sometimes suffer from hallucinations, a term for making up information, or to return other faulty output. Also, we are not doctors or medical professionals, and do not claim to be.
As we say in the Bestie Chat pages:
Your AI interactions here are with an experimental customized agent. Each message can take a while to populate. It is not meant to replace accredited medical or other advice at this time. Do not put personal, identifying information in this chat.
We do, however, hope that the information provided can give you ideas and support.