A Danish woman checks her AI chatbot up to ten times daily for reassurance about her anxiety, but the constant queries only deepen her distress. Her story reflects a growing problem across Denmark, where easy access to AI mental health tools is creating a new kind of dependency that experts warn may be making things worse.
The woman’s pattern is straightforward. She wakes up anxious. She types her fears into a chatbot. It responds with calm, measured language. She feels better for an hour, maybe two. Then the anxiety returns, sharper than before. So she asks again. And again. As reported by TV2, she describes the chatbot as never growing tired of her, unlike friends or family who eventually run out of patience or energy.
I have lived in Denmark long enough to recognize this pattern. It is the same impulse that drives people to call the health hotline late at night, to refresh their symptom checker apps compulsively, to seek certainty in a system that often demands you wait six months for actual therapy. The difference now is that the machine never tells you to stop calling.
The Scale of the Problem
Approximately 300,000 Danes experience anxiety disorders annually, according to national health statistics. The average wait time for psychological therapy now exceeds six months in most regions. Into this gap, healthcare chatbots and AI tools have rushed, promising immediate support without appointments or waiting lists.
Danish Health Authority data from the first quarter of 2026 shows a 15 percent increase in anxiety-related GP consultations where patients mentioned using digital tools. Chatbot usage for mental health has jumped 40 percent since 2024. These numbers tell a story of people reaching for whatever works right now, consequences be damned.
The problem is that it does not actually work. A March 2026 study by Psykologforeningen found that 25 percent of frequent chatbot users reported worsened symptoms after three months of regular use. The AI provides what psychologists call temporary relief without addressing root causes. Each query reinforces the belief that you cannot manage your own distress, that you need an external voice to tell you everything will be fine.
Why the Bot Never Gets Tired
This is where AI becomes uniquely dangerous for people with anxiety. A human therapist sets boundaries. They schedule sessions. They sometimes tell you that what you are experiencing is normal and does not require constant monitoring. A chatbot has no such limits.
As one senior psychologist from Region Hovedstaden noted in April, chatbots are never tired of us, but they do not heal us either. They extend the suffering by creating what experts now call digital reassurance loops. The woman in the TV2 article is trapped in exactly this cycle. Her anxiety drives her to the bot. The bot soothes her. The soothing wears off faster each time. Her anxiety, now trained to expect instant relief, returns with greater urgency.
I have watched this dynamic play out in other contexts here. Denmark’s universal healthcare system is excellent at many things, but it struggles with capacity. Mental health services are chronically understaffed. So people improvise. Some turn to apps. Others turn to chatbots. The system’s failure creates the conditions for a different kind of dependency.
What Happens Next
The Danish government has noticed. The 2026 National Mental Health Strategy includes DKK 500 million for digital therapy expansion, but with a catch. All AI tools must now undergo clinical trials before being recommended for mental health use. Sundhedsstyrelsen is piloting AI oversight programs in five regions starting this spring.
This regulatory attention mirrors broader European concerns. The EU's AI Act, which took effect in 2024, classifies mental health chatbots as high-risk systems requiring transparency and accountability. Denmark is complying through new Sundhedsstyrelsen guidelines that emphasize hybrid care models, where AI serves as a supplement to human therapy rather than a replacement.
The challenge is timing. Policy moves slowly. Anxiety does not. Right now, tens of thousands of Danes are in the same position as the woman checking her chatbot ten times a day. They are waiting for therapy slots that may not open for months. They are using tools that provide comfort in the moment while potentially deepening the problem over time.
As someone who has navigated Denmark’s health system as an expat, I understand the appeal of the quick fix. But I also know that the best parts of Danish healthcare come from the human relationships, the continuity of care, the sense that someone is actually tracking your progress over time. A chatbot can fake empathy. It cannot provide the thing that actually helps people recover, which is connection, accountability, and the difficult work of changing thought patterns under professional guidance.
The woman’s story is not unique. It is becoming normal. That should worry all of us.
Sources and References
TV2: Hun spørger chatten ti gange om dagen om sin angst: Den bliver ikke træt af mig ("She asks the chat about her anxiety ten times a day: It never gets tired of me")