What I learnt when I tried replacing my doctor with an algorithm
A health scare kept me on edge… until reflection helped me find comfort in the one thing I know best
It was a warm August morning. At first, it felt like any other day.
My morning began as usual: wake at 7.15, follow routine, and sip sugarless tea—a necessity since my diabetes diagnosis. By 9.30, breakfast was done and hands were washed.
Moments after my medication, the familiar burn returned—a dull ache that had lingered since 2017, when diabetes entered my life. My usual remedies, cool water and buttermilk, offered only brief relief.
The discomfort persisted for days, growing sharper with spicy food. I had never had trouble with spice before, but now even a trace of red chilli ignited deep unease in my stomach.
I brushed it aside until I could not. A new, persistent buzzing appeared above my liver.
Alarmed, I cut out spice and salt, turning to mild comfort foods—sprouted green beans, flattened rice with lentils and vegetables, and plenty of buttermilk—to calm the irritation.
Driven by worry, I entered every detail into Gemini, hoping to make sense of my symptoms. I understood the limits: it could only offer general information, not a diagnosis. Still, I was desperate for any clue.
I asked about long-term non-alcoholic fatty liver disease and potential liver damage. I had lived with type 2 diabetes for years, but not always responsibly. There were lapses—junk food, missed walks, late nights spent writing instead of resting.
Gemini’s responses were both reassuring and unnerving.
Reassuring, because they spoke calmly about lifestyle, diet, and exercise—actions within my control.
Unnerving, because every few lines carried an implicit warning: if ignored, the condition could progress to fibrosis, cirrhosis, or worse.
I stared blankly at the screen as my screen reader repeated those words—cirrhosis, fibrosis, liver failure. They were no longer abstract; they loomed like a shadow over my future. Was it reason or denial that guided me now?
I bombarded Gemini with more questions, hoping one of them would finally reveal the truth.
________________________________________
When Searching Becomes Obsession
The urge to keep asking is overwhelming. You think, one more question, one more symptom—then I will know.
But digital tools, no matter how advanced, cannot know your body or history. They only see patterns, not your reality.
Each search fed my anxiety. What began as curiosity turned quickly into fear.
I moved up my doctor’s appointment and underwent an abdominal ultrasound. The report showed mild inflammation linked to diabetes—not alarming, my diabetologist said.
Relief lasted only a few hours. Soon I was back online, re-reading symptoms, connecting every sensation to the worst possible outcome.
Days between appointments stretched endlessly. I would wake at 2 am, reach for my phone, and retype the same questions I had asked the night before.
________________________________________
Choosing to Act
It happened on a Tuesday afternoon, three days after I received the ultrasound results.
I sat at my desk, the cursor blinking on an empty page. My fingers hovered, ready to type yet another question whose answer I already knew.
Then I stopped and took stock of what I had actually done about my fatty liver condition until that point. All I had managed was to panic, feed more questions to Gemini, grow overwhelmed by what I read, manufacture new questions, and begin again: a veritable Sisyphean effort, pushing myself deeper into anxiety.
A series of deep breaths later, my logical mind took over with a simple question: Is it worth the trouble?
The answer was equally simple: no. But what exactly was I saying no to?
I could read countless AI-generated medical articles, but I am not qualified to interpret the symptoms they describe and draw a conclusion. Even with medical consultation, some loose ends will remain. I will probably have to live with the uncertainty, instead of bolting up from bed after every sweaty realisation that something could be serious and burning the midnight oil over things I no longer fully understand.
And yes, I have faced this before, though it was long ago, when I was a teenager. At that time, my doctors were unsure how and when my blindness would progress. That brought similar uncertainties and even deeper existential questions. But I accepted the condition then. So what would stop me from accepting now that some things lie beyond my control?
For weeks I had searched for certainty outside myself. Now I was ready to find it within.
The fear had drained me. I no longer wanted to search—I wanted to act.
A new urgency took over—the need to write.
I had so much to say, so much left unsaid. This was not about importance. It was about leaving a trace, a record of what I had lived for.
Perhaps someone in my family would one day find it and see the world through my lens. I have often wished for that myself—to know how my grandfathers lived, how they saw the world change around them.
I know only fragments. Two of my maternal granduncles worked at Bhagwan Ramana Maharshi’s ashram, and my paternal grandfather had a story of entrepreneurship. Beyond that, silence.
That silence felt heavier than illness.
I acted. I closed the browser, opened my long-neglected manuscript, and began to write. The fear remained, but now it was fuel. Time felt short—my story had to be told.
That small decision—to live instead of search—taught me something profound: technology can support, but not define, my reality. The lesson was simple—trust experience, prepare, but do not let any tool replace your judgment or agency.
________________________________________
Learning to Partner with AI
No new answer could save me. What I needed was a rule I had forgotten in my panic: use tools to prepare, not as a replacement for judgment or professional advice.
Medical websites and technology can summarise conditions, list possible causes, and suggest what to discuss with a doctor. They are convenient—but also seductive.
Doctors told me patients who used AI beforehand often described their symptoms more clearly. They came prepared, and that helped.
A 2025 study by Yale and the University of Zurich reported similar results. AI-based mental health tools reduced anxiety for some users but heightened it for others—especially those who sought comfort from chatbots. The more reassurance we seek from a machine, the further we drift from our own calm.
So I set a limit: one AI session, thirty minutes, no more. I would gather questions, note down terms, and prepare for my appointments. Then I would stop, resisting the temptation to chase the AI’s hallucinations any further.
The same rule can work for you: if your heart races or your thoughts spiral after using AI, recognise that as anxiety, not clarity. When curiosity becomes compulsion, pause.
Schedule AI-free hours. Keep devices out of restful spaces. Before and after each session, ask yourself honestly: Do I feel calm or unsettled? If the heaviness remains, take a break.
The same principle now shapes my writing. Technology helps me brainstorm and untangle ideas, but I keep the words my own.
If I let a tool rewrite my sentences or validate my story, I am no longer writing—only hiding behind fluency.
Writers face the same trap as panicking patients. It is easy to stay up at 3 am, questioning: Is this good enough? Does this sound authentic? Should I cut this scene? We mistake digital confidence for discernment.
No tool can tell you if a metaphor moves a reader to tears, or if your confession helps someone feel less alone. Technology can analyse patterns, but it cannot feel the weight of your thoughts. Believe in your own words and their ability to move people, instead of constantly seeking validation from AI tools.
The thirty-minute rule applies here too. Use tools to spark ideas or point out gaps—but never to diagnose your voice or measure your worth.
Technology can sound wise, but it never brings the peace that comes from your own lived experience.
________________________________________
A Different Kind of Control
Each morning still brings some discomfort and unease. I am awaiting my gastroenterologist appointment.
But I am no longer paralysed. I am finishing the books I delayed for years. I walk in the mornings again. I eat to nourish, not punish, my body.
And I use technology differently—mindfully.
When anxiety creeps in—about my health or my work—I acknowledge it, give it ten minutes, then move on. I talk to friends. I breathe mindfully. I hold on to routines. Physical steadiness brings emotional steadiness, and that steadiness allows deep work.
I now frame my questions like a journalist seeking facts, not a patient seeking prophecy. I verify everything with trusted doctors or credible sources.
If anxiety becomes severe or begins to disrupt sleep, work, or relationships, I know it is time to seek medical help. Apps such as Wysa or Abby can aid reflection, but they cannot replace genuine connection. Community, not code, steadies the mind.
This grounding in reality has given me something no algorithm can: a sense of control over my inner landscape.
If you too have found yourself feeding fears into a chatbot at 2 am, asking it to solve what only time, professionals, or courage can heal, you already know this truth.
The tool that promised clarity can easily become the source of confusion.
You can stop. You can use AI as a preparation tool, not an emotional crutch. You can give yourself permission to not know everything now. You can wait for real answers from real people who understand your story.
Ultimately, knowledge does not always bring peace. Understanding does.
So I ask once, breathe twice, and close the tab. I sit in the quiet that no algorithm can simulate—a silence that feels, finally, like peace.