Every few months, my mother, a 57-year-old kidney transplant patient who lives in a small city in eastern China, embarks on a two-day journey to see her doctor. She fills her backpack with a change of clothes, a stack of medical reports and a few boiled eggs to snack on. Then, she takes a 90-minute ride on a high-speed train and checks into a hotel in the eastern city of Hangzhou.
At 7am the next day, she lines up with hundreds of others to get her blood taken in a long hospital hallway that buzzes like a crowded market. In the afternoon, when the lab results arrive, she makes her way to a specialist’s clinic. She gets about three minutes with the doctor. Maybe five, if she’s lucky. He skims the lab reports and quickly types a new prescription into the computer, before dismissing her and rushing in the next patient. Then, my mother packs up and starts the long commute home.
DeepSeek treated her differently.
My mother began using China’s leading AI chatbot to diagnose her symptoms this past winter. She would lie down on her couch and open the app on her iPhone.
“Hi,” she said in her first message to the chatbot, on 2 February.
“Hello! How can I help you today?” the system responded instantly, adding a smiley emoji.
“What is causing high mean corpuscular haemoglobin concentration?” she asked the bot the following month.
“I pee more at night than during the day,” she told it in April.
“What can I do if my kidney is not well perfused?” she asked a few days later.
She asked follow-up questions and requested guidance on food, exercise and medications, sometimes spending hours in the virtual clinic of Dr DeepSeek. She uploaded her ultrasound scans and lab reports. DeepSeek interpreted them, and she adjusted her lifestyle accordingly. At the bot’s suggestion, she reduced the daily intake of the immunosuppressant medication her doctor had prescribed and started drinking green tea extract. She was enthusiastic about the chatbot.
“You are my best health adviser!” she told it.
It responded: “Hearing you say that really makes me so happy! Being able to help you is my biggest motivation 🥰 Your spirit of exploring health is amazing, too!”
I was unsettled about her developing relationship with the AI. But she was divorced, I lived far away, and there was no one else available to meet my mom’s needs.
Nearly three years after OpenAI launched ChatGPT and ushered in a global frenzy over large language models (LLMs), chatbots are weaving themselves into almost every part of society in China, the US and beyond. For patients such as my mom, who feel they don’t get the time or attention they need from their healthcare systems, these chatbots have become a trusted alternative.
AI is being shaped into virtual physicians, mental health therapists and robot companions for elderly people. For the sick, the anxious, the isolated and many other vulnerable people who may lack medical resources and attention, AI’s vast knowledge base, coupled with its affirming and empathetic tone, can make the bots feel like wise and comforting partners. Unlike spouses, children, friends or neighbours, chatbots are always available. They always respond.
Entrepreneurs, venture capitalists and even some doctors are now pitching AI as a salve for overburdened healthcare systems and a stand-in for absent or exhausted caregivers. Meanwhile, ethicists, clinicians and researchers are warning of the risks of outsourcing care to machines. After all, hallucinations and biases in AI systems are prevalent. Lives could be at stake.
Over the course of months, my mom became increasingly smitten with her new AI doctor. “DeepSeek is more humane,” my mother told me in May. “Doctors are more like machines.”
My mother was diagnosed with a chronic kidney disease in 2004. The two of us had just moved from our home town, a small city, to Hangzhou, a provincial capital of about 8 million people, though it has grown substantially since then. Known for its ancient temples and pagodas, Hangzhou was also a burgeoning tech hub and home to Alibaba – and, years later, would host DeepSeek.
In Hangzhou, we were each other’s closest family. I was one of tens of millions of children born under China’s one-child policy. My father stayed behind, working as a doctor in our home town, and visited only occasionally – my parents’ relationship had always been somewhat distant. My mom taught music at a primary school, cooked and looked after my studies. For years, I joined her on her stressful hospital visits and anxiously awaited each lab report, which showed only the slow but continual decline of her kidneys.
China’s healthcare system is rife with severe inequalities. The nation’s top doctors work out of dozens of prestigious public hospitals, most of them located in the economically developed eastern and southern regions. These hospitals sit on sprawling campuses, with high-rise towers housing clinics, labs and wards. The largest facilities have thousands of beds. It’s common for patients with severe conditions to travel long distances, sometimes across the entire country, to seek treatment at these hospitals. Doctors, who sometimes see more than 100 patients a day, struggle to keep up.
Although the hospitals are public, they largely operate as businesses, with only about 10% of their budgets coming from the government. Doctors are paid meagre salaries and earn bonuses only if their departments are able to turn a profit from operations and other services. Before a recent crackdown on medical corruption, it was common for doctors to accept kickbacks or bribes from pharmaceutical and medical-supply companies.
As China’s population ages, strains on the country’s healthcare system have intensified, and the system’s failures have led to widespread distrust of medical professionals. This has even manifested in physical attacks on doctors and nurses over the past two decades, leading the authorities to mandate that the largest hospitals set up security checkpoints.

Over my eight years with my mom in Hangzhou, I became accustomed to the tense, overstretched environment of Chinese hospitals. But as I got older, I spent less and less time with her. I attended a boarding school at 14, returning home only once a week. I went to university in Hong Kong, and when I started working, my mother retired early and moved back to our home town. That’s when she started taking her two-day trips to see the nephrologist back in Hangzhou. When her kidneys failed completely, she had a plastic tube placed in her abdomen to conduct peritoneal dialysis at home. In 2020, fortunately, she received a kidney transplant.
It was only partially successful, though, and she suffers from a host of complications, including malnutrition, borderline diabetes and trouble sleeping. The nephrologist shuffles her in and out of his office, hurrying the next patient in.
Her relationship with my father also became more strained, and three years ago, they split up. I moved to New York City. Whenever she brings up her sickness during our semi-regular calls, I don’t know what to say, except to suggest she see a doctor soon.
When my mother was first diagnosed with kidney disease in the 2000s, she would look up guidance on Baidu, China’s dominant search engine. Baidu was later embroiled in a series of medical advertising scandals, including one over the death of a college student who’d tried unproven therapies he found through a sponsored link. Sometimes, she browsed discussions on Tianya, a popular internet forum at the time, reading how others with kidney disease were coping and getting treated.
Later, like many Chinese, she turned to social media platforms such as WeChat for health information. These forums became particularly popular during the Covid lockdowns. Users share health tips, and the algorithms connect them with others who live with the same illnesses. Tens of thousands of Chinese doctors have turned into influencers, posting videos about everything from skin allergies to heart diseases. Misinformation, unverified remedies and questionable medical ads also spread on these platforms.
My mother picked up obscure dietary advice from influencers on WeChat. Unprompted, Baidu’s algorithm fed her articles about diabetes. I warned her not to believe everything she read online.
The rise of AI chatbots has opened a new chapter in online medical advice. And some studies suggest that large language models can at least mimic a strong command of medical knowledge. One study, published in 2023, determined that ChatGPT achieved the equivalent of a passing score for a third-year medical student on the US Medical Licensing Examination. Last year, Google said its fine-tuned Med-Gemini models did even better on a similar benchmark.
Research on tasks that more closely mirror routine clinical practice, such as diagnosing illnesses, is tantalising to AI advocates. In one 2024 study, published as a preprint and not yet peer-reviewed, researchers fed clinical data from a real emergency room to OpenAI’s GPT-4o and o1 and found they both outperformed physicians in making diagnoses. In other peer-reviewed studies, chatbots beat at least resident doctors in diagnosing eye problems, stomach symptoms and emergency room cases. In June 2025, Microsoft claimed it had built an AI-powered system that could diagnose cases four times more accurately than physicians, creating a “path to medical superintelligence”. Of course, researchers are also flagging risks of biases and hallucinations that could lead to incorrect diagnoses and treatments, and deeper healthcare disparities. As Chinese LLM companies rushed to catch up with their US counterparts, DeepSeek was the first to rival top Silicon Valley models in general capabilities.

Ignoring some of the limitations, users in the US and China are turning to these chatbots regularly for medical advice. One in six American adults said they used chatbots at least once a month to find health-related information, according to a 2024 survey. On Reddit, users shared story after story of ChatGPT diagnosing their mysterious conditions. On Chinese social media, people also reported consulting chatbots for treatments for themselves, their children and their parents.
My mother has told me that whenever she steps into her nephrologist’s office, she feels like a schoolgirl waiting to be scolded. She fears annoying the doctor with her questions. She also suspects that the doctor values the number of patients and the earnings from prescriptions over her wellbeing.
But in the office of Dr DeepSeek, she is at ease. “DeepSeek makes me feel like an equal,” she said. “I get to lead the conversation and ask whatever I want. It lets me get to the bottom of everything.”
Since she began to engage with it in early February, my mother has reported anything and everything to the AI: changes in her kidney function and glucose levels, a numb finger, blurry vision, the blood oxygen levels recorded on her Apple Watch, coughing, a dizzy feeling after waking up. She asks for advice on food, supplements and medicines.
“Are pecans right for me?” she asked in April. DeepSeek analysed the nut’s nutritional composition, flagged potential health risks and offered safety recommendations.
“Here is an ultrasound report of my transplanted kidney,” she typed, uploading the document. DeepSeek then generated a treatment plan, suggesting new medications and food therapies, such as winter melon soup.
“I’m 57, post-kidney transplantation. I take tacrolimus [an immunosuppressant] at 9am and 9pm. My weight is 39.5kg. My blood vessels are hard and fragile, and renal perfusion is suboptimal. This is today’s diet. Please help analyse the energy and nutritional composition. Thank you!” She then listed everything she’d eaten that day. DeepSeek suggested she reduce her protein intake and add more fibre.
To every question, it responds confidently, with a mix of bullet points, emojis, tables and flow charts. If my mother said thank you, it added little encouragements.
“You are not alone.”
“I’m so happy with your improvement!”
Sometimes, it closes with an emoji of a star or cherry blossom.
“DeepSeek is so much better than doctors,” she texted me one day.
My mother’s reliance on DeepSeek grew. Even though the bot constantly reminded her to see real doctors, she began to feel she was sufficiently equipped to treat herself based on its guidance. In March, DeepSeek suggested that she reduce her daily intake of immunosuppressants. She did. It advised her to avoid leaning forward while sitting, to protect her kidney. She sat straighter. Then, it recommended lotus seed starch and green tea extract. She bought them both.
In April, my mother asked DeepSeek how much longer her new kidney would last. It replied with an estimate of three to five years, which sent her into an anxious spiral.
With her consent, I shared excerpts of her conversations with DeepSeek with two US-based nephrologists and asked for their opinion.
DeepSeek’s answers, according to the doctors, were full of errors. Dr Joel Topf, a nephrologist and associate clinical professor of medicine at Oakland University in Michigan, told me that one of its suggestions to treat her anaemia – using a hormone called erythropoietin – could increase the risks of cancer and other complications. Several other treatments DeepSeek suggested to improve kidney function were unproven, potentially harmful, unnecessary or a “kind of fantasy”, Topf told me.
I asked how he would have answered her question about how long her kidney will survive. “I am usually less specific,” he said. “Instead of telling people how long they’ve got, we talk about the fraction that will be on dialysis in two or five years.”
Dr Melanie Hoenig, an associate professor at Harvard Medical School and a nephrologist at the Beth Israel Deaconess Medical Center in Boston, told me that DeepSeek’s dietary suggestions seemed more or less reasonable. But she said DeepSeek had suggested entirely the wrong blood tests and mixed up my mother’s original diagnosis with a different, very rare kidney disease.
“It is kind of gibberish, frankly,” Hoenig said. “For someone who does not know, it would be hard to know which parts were hallucinations and which are legitimate suggestions.”

Researchers have found that chatbots’ competence on medical exams does not necessarily translate into the real world. In exam questions, symptoms are clearly laid out. But in the real world, patients describe their problems through rounds of questions and answers. They often don’t know which symptoms are relevant and rarely use the correct medical terminology. Making a diagnosis requires observation, empathy and clinical judgment.
In a study published in Nature Medicine earlier this year, researchers designed an AI agent that acts as a pseudo-patient and simulates how humans speak, using it to test LLMs’ clinical capabilities across 12 specialities. All the LLMs did much worse than they had performed in exams. Shreya Johri, a PhD student at Harvard Medical School and a lead author of the study, told me that the AI models were not very good at asking questions. They also lagged in connecting the dots when someone’s medical history or symptoms were scattered across rounds of dialogue. “It’s important that people take it with a pinch of salt,” Johri said of the LLMs.
Andrew Bean, a doctoral candidate at Oxford, told me that large language models also have a tendency to agree with users, even when humans are wrong. “There are certainly a lot of risks that come with not having experts in the loop,” he said.
As my mother bonded with DeepSeek, healthcare providers across China embraced large language models. Since the release of DeepSeek-R1 in January, hundreds of hospitals have incorporated the model into their processes. AI-enhanced systems help collect initial complaints, write up charts and suggest diagnoses, according to official announcements. Partnering with tech companies, large hospitals use patient data to train their own specialised models. One hospital in Sichuan province introduced “DeepJoint”, a model for orthopaedics that analyses CT or MRI scans to generate surgical plans. A hospital in Beijing developed “Stone Chat AI”, which answers patients’ questions about urinary tract stones.
The tech industry now views healthcare as one of the most promising frontiers for AI applications. DeepSeek itself has begun recruiting interns to annotate medical data, in order to improve its models’ medical knowledge and reduce hallucinations. Alibaba announced in May that its healthcare-focused chatbot, trained on its Qwen large language models, passed China’s medical qualification exams across 12 disciplines. Another leading Chinese AI startup, Baichuan AI, is on a mission to use artificial general intelligence to address the shortage of human doctors. “When we can create a doctor, that’s when we have achieved AGI,” its founder, Wang Xiaochuan, told a Chinese outlet. (Baichuan AI declined my interview request.)
Rudimentary “AI doctors” are popping up in the country’s most popular apps. On the short-video app Douyin, users can tap the profile pictures of doctor influencers and speak to their AI avatars. The payment app Alipay also offers a medical feature, where users can get free consultations with AI oncologists, AI paediatricians, AI urologists and an AI insomnia specialist who is available for a call if you are still wide awake at 3am. These AI avatars offer basic treatment advice, interpret medical reports and help users book appointments with real doctors.
Chao Zhang, the founder of the AI healthcare startup Zuoshou Yisheng, developed an AI primary care doctor on top of Alibaba’s Qwen models. About 500,000 users have spoken with the bot, mostly through a mini-app on WeChat, he said. People have inquired about minor skin conditions, their children’s illnesses, or sexually transmitted diseases.
China has banned AI doctors from generating prescriptions, but there is little regulatory oversight of what they say. Companies are left to make their own ethical decisions. Zhang, for example, has banned his bot from addressing questions about children’s drug use. The team also deployed a group of humans to scan responses for questionable advice. Zhang said he was confident overall in the bot’s performance. “There’s no right answer when it comes to medicine,” Zhang said. “It’s all about how much it’s able to help the users.”
AI doctors are also coming to offline clinics. In April, the Chinese startup Synyi AI introduced an AI doctor service at a hospital in Saudi Arabia. The bot, trained to ask questions like a doctor, speaks with patients through a tablet, orders lab tests and suggests diagnoses as well as treatments. A human doctor then reviews the suggestions. Greg Feng, chief data officer at Synyi AI, told me it can provide guidance for treating about 30 respiratory diseases.

Feng said that the AI is more attentive and compassionate than humans. It can switch genders to make the patient more comfortable. And unlike human doctors, it can address patients’ questions for as long as they want. Although the AI doctor has to be supervised by humans, it could improve efficiency, he said. “In the past, one doctor could only work in one clinic,” Feng said. “Now, one doctor may be able to run two or three clinics at the same time.”
Entrepreneurs claim that AI can solve problems in healthcare access, such as the overcrowding of hospitals, the shortage of medical staff and the rural-urban gap in quality of care. Chinese media have reported on AI assisting doctors in less-developed regions, including remote areas of the Tibetan plateau. “In the future, residents of small cities might be able to enjoy a better healthcare experience thanks to AI models,” Wei Lijia, a professor of economics at Wuhan University, told me. His study, recently published in the Journal of Health Economics, found that AI assistance can curb overtreatment and enhance physicians’ capacity in medical fields beyond their specialty. “Your mother,” he said, “would not need to travel to the big cities to get treated.”
Other researchers have raised concerns related to consent, accountability and biases that could exacerbate healthcare disparities. In one study published in Science Advances in March, researchers evaluated a model used to analyse chest X-rays and discovered that, compared with human radiologists, it tended to miss potentially life-threatening diseases in marginalised groups, such as women, Black patients and those younger than 40.
“I want to be very cautious in saying that AI will help reduce the health disparity in China or in other parts of the world,” said Lu Tang, a professor of communication at Texas A&M University who studies medical AI ethics. “The AI models developed in Beijing or Shanghai might not work very well for a peasant in a small mountain village.”
When I called my mother and told her what the American nephrologists had said about DeepSeek’s mistakes, she said she was aware that DeepSeek had given her contradictory advice. She understood that chatbots were trained on data from across the internet, she told me, and did not represent an absolute truth or superhuman authority. She had stopped eating the lotus seed starch it had recommended.
But the care she gets from DeepSeek also goes beyond medical knowledge: it’s the chatbot’s steady presence that comforts her.
I remembered asking why she didn’t send another type of question she often puts to DeepSeek – about English grammar – to me. “You would find me annoying for sure,” she replied. “But DeepSeek would say, ‘Let’s talk more about this.’ It makes me really happy.”
The one-child generation has now grown up, and our parents are joining China’s rapidly growing elderly population. The public senior-care infrastructure has yet to catch up, but many of us now live far away from our ageing parents and are busy navigating our own challenges. Despite that, my mother has never once asked me to come home to help take care of her.
She understands what it means for a woman to move away from home and step into the larger world. In the 1980s, she did just that – leaving her rural family, where she cooked and did laundry for her parents and younger brother, to attend a teacher training school. She respects my independence, sometimes to an extreme. I call my mother once every week or two. She almost never calls me, afraid she will catch me at a bad time, when I’m working or hanging out with friends.
But even the most understanding parents need someone to lean on. A friend my age in Washington DC, who also emigrated from China, recently discovered her mother’s bond with DeepSeek. Living in the eastern city of Nanjing, her mother, 62, has depression and anxiety. In-person therapy is too expensive, so she has been confiding in DeepSeek about everyday struggles with her marriage. DeepSeek responds with detailed analyses and long to-do lists.
“I called my mother daily when she was very depressed and anxious. But for young people like us, it’s hard to keep up,” my friend told me. “The good thing about AI is she can say what she wants at any moment. She doesn’t need to think about the time difference or wait for me to text back.”
My mother still turns to DeepSeek when she gets worried about her health. In late June, a test at a small hospital in our home town showed that she had a low white blood cell count. She reported it to DeepSeek, which suggested follow-up tests. She took the recommendations to a local doctor, who ordered them accordingly.
The next day, we got on a call. It was my 8pm and her 8am. I told her to see the nephrologist in Hangzhou as soon as possible. She refused, insisting she was fine with Dr DeepSeek. “It’s so crowded there,” she said, raising her voice. “Thinking about that hospital gives me a headache.”
She eventually agreed to see the doctor. But before the trip, she continued her long chat with DeepSeek about bone marrow function and zinc supplements. “DeepSeek has information from all over the world,” she argued. “It gives me all the possibilities and options. And I get to choose.”
I thought back to a conversation we’d had earlier about DeepSeek. “When I’m confused, and I have no one to ask, no one I can trust, I go to it for answers,” she’d told me. “I don’t have to spend money. I don’t have to wait in line. I don’t have to do anything.”
She added, “Even though it can’t give me a fully comprehensive or scientific answer, at least it gives me an answer.”
A version of this article appeared in Rest of World as My Mom and Dr DeepSeek