Saudis Turn to AI for Medical Answers, Doctors Caution

Artificial intelligence is increasingly shaping how Saudis approach their health, with many turning to AI tools before visiting clinics. From symptom checks to diet planning, chatbots now provide tailored responses within seconds, altering the traditional doctor-first model.

Shift in Patient Behavior

Family physician Dr. Mohannad Al-Qarni has observed a significant shift: patients often treat physicians as a second opinion, having consulted AI first. Many arrive with AI-generated interpretations of their symptoms and medications, reshaping the consultation. Patients pose direct questions prompted by the tools, such as whether they need a colonoscopy, and look to the doctor mainly for confirmation. The challenge, he says, lies in ensuring accuracy and context to avoid unnecessary anxiety.

Personal Experiences with AI

Reem Al-Harbi, 27, who has PCOS and IBS, used ChatGPT to manage her digestive issues. The chatbot suggested laxatives and an eight-week low-FODMAP diet, which helped her identify onions and gluten as triggers. She now relies on AI for meal planning and tracking. Hayat Hasan, 32, by contrast, uses AI cautiously for recipes and emotional support but stresses that it should not replace professionals, noting that AI lacks knowledge of a patient's medical history and of the consequences of its advice.


Noura Al-Harbi used AI to help her mother, who had stomach pain and received conflicting diagnoses. The chatbot suggested acid reflux and gastritis, later confirmed by a doctor. Sarah Al-Zahrani uploaded her mother's pneumonia lab reports to ChatGPT for interpretation, which helped her formulate questions for the doctor.

Risks and Concerns

Doctors warn that AI can over-interpret minor abnormalities or miss clinical context, causing unnecessary worry. Psychiatrist Dr. Heba AlSaad highlights the risk of misdiagnosis, especially mistaking normal stress for a psychiatric disorder, or overlooking serious conditions such as bipolar disorder and failing to detect suicidality. AI cannot interpret non-verbal cues or assess risk levels, and overuse may worsen anxiety or obsessive thinking. She acknowledges AI's supportive role in psychoeducation, mood tracking, and organizing questions, but insists it should not replace clinical diagnosis, risk assessment, or treatment decisions.

Systemic Factors and Adaptation

The popularity of AI partly reflects gaps in the accessibility and affordability of mental health care. Doctors are adapting, with some using AI as a support tool. Users like Mazyar Ali Nicolas Javadi recognize its limitations, noting that AI can estimate macros incorrectly and should not be used for mental health concerns. Beyond consumer use, AI is entering clinical infrastructure: HakeemDx, a Saudi-based clinical decision support system founded by Bilal Adi, combines language models with trusted medical guidelines to assist clinicians, positioning itself as an assistant rather than a replacement.
