Denmark Probes AI Use of Patient Records

Frederikke Høye

The Danish Data Protection Authority has launched a formal investigation into the Capital Region of Denmark over its use of 3.65 million citizens’ hospital records in an AI research project focused on psychiatric treatment, raising significant data privacy concerns.

Use of Patient Records Under Scrutiny

The Danish Data Protection Authority (Datatilsynet) is investigating whether the Capital Region of Denmark breached data protection rules in a major research initiative using artificial intelligence. The project, called the Precision Psychiatry Initiative, involves analyzing sensitive hospital records from 3.65 million Danish citizens without their explicit consent.

The research aims to improve psychiatric care using machine learning, but it has come under criticism for proceeding without a mandatory risk assessment known as a data protection impact assessment (DPIA). Such an assessment is required under the EU's General Data Protection Regulation (GDPR) whenever a project poses a high risk to individuals' rights and freedoms.

The investigation follows a report from Denmark’s national broadcaster DR, which revealed that the Capital Region had approved the project without first conducting this type of analysis. Experts have publicly criticized the lack of basic risk evaluation, calling it a significant oversight.

What the Project Involves

Launched in 2022 and led by Professor Michael Benros from the Capital Region’s psychiatric services, the Precision Psychiatry Initiative received funding of 30 million Danish kroner (approximately $4.3 million) from the Lundbeck Foundation. It aims to harness AI to identify patterns in mental illness and help clinicians better predict and treat psychiatric disorders.

The dataset is vast and includes medical records such as diagnoses, lab test results, and transcriptions of doctor-patient interactions. The records come from anyone who has interacted with hospitals in the Capital Region or Region Zealand since 2000, and the approval covers data collected up until 2030.

The records are pseudonymized, meaning names and direct identifiers have been replaced, but they are not fully anonymized. Under data protection rules, the data is therefore still considered personal and must be handled with strict care.
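To illustrate the distinction, here is a minimal, hypothetical Python sketch of keyed-hash pseudonymization, one common way to replace direct identifiers; the key, identifier, and record fields below are invented and do not describe the project's actual pipeline.

import hashlib
import hmac

# Hypothetical key held only by the data controller; anyone holding it
# (or the original mapping) can link pseudonyms back to people.
SECRET_KEY = b"held-by-the-data-controller"

def pseudonymize(national_id: str) -> str:
    """Replace a direct identifier with a stable pseudonym."""
    return hmac.new(SECRET_KEY, national_id.encode(), hashlib.sha256).hexdigest()

record = {"national_id": "010190-1234", "diagnosis": "F32.1"}
record["national_id"] = pseudonymize(record["national_id"])

# The same person always maps to the same pseudonym, so records remain
# linkable to an individual; this is why the GDPR still treats such data
# as personal data rather than anonymous data.

Because the mapping remains reversible in principle, pseudonymization reduces risk but does not take the data out of the GDPR's scope, unlike true anonymization.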

Potential Legal Consequences

The Data Protection Authority has demanded a full explanation from the Region by September 16. Officials stress that launching a high-risk project without a prior DPIA may be unlawful. If the processing is found to be illegal, the project may have to be terminated, and all collected data, along with any research results generated from it, could be ordered deleted.

Such an order would jeopardize years of work and the millions in funding invested since the project began more than two years ago. It would also set a precedent for how AI and sensitive health data can legally be combined in future Danish and European research efforts.

The Region’s Position

The top executive of the Capital Region, Erik Jylling, recently acknowledged the possibility that the region should have conducted the risk assessment. He insisted, however, that the project complied with data protection regulations at the time it was approved. He cited the evolving interpretations of data protection norms as a complicating factor.

Still, this has not satisfied legal and cybersecurity experts, who maintain that a DPIA is a fundamental requirement under the GDPR whenever sensitive personal data is processed, especially at the scale involved here.

Wider Implications for Public Institutions

The Danish Data Protection Authority used the case to call on all public institutions, including municipalities, hospitals, and state agencies, to ensure that enthusiasm for new technologies such as AI does not override individuals' privacy rights. It emphasized that innovation must not come at the expense of fundamental rights and legal responsibilities.

What Is a Data Protection Impact Assessment?

A DPIA is a process that public or private entities must conduct when handling data that could present a high risk to individuals. The assessment should:

– Describe how personal data is processed,
– Evaluate whether the data usage is necessary and proportionate,
– Identify and mitigate potential risks to affected individuals.
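As a rough illustration of the structure only, the three elements above could be tracked as structured data. The following Python sketch is hypothetical; its field names and example entries are invented rather than drawn from any official DPIA template.

from dataclasses import dataclass, field

@dataclass
class DPIA:
    processing_description: str   # how personal data is processed
    necessity_justification: str  # why the usage is necessary and proportionate
    risks: list[str] = field(default_factory=list)        # risks to individuals
    mitigations: list[str] = field(default_factory=list)  # planned safeguards

    def is_complete(self) -> bool:
        """A DPIA is only useful if every element is actually filled in."""
        return all([self.processing_description,
                    self.necessity_justification,
                    self.risks,
                    self.mitigations])

# Invented example entries, loosely inspired by the case described above.
assessment = DPIA(
    processing_description="Pseudonymized hospital records analyzed by ML models",
    necessity_justification="Research goals cannot be met with aggregate data alone",
    risks=["re-identification of patients"],
    mitigations=["access controls", "pseudonymization key held by data controller"],
)
assert assessment.is_complete()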

Failure to carry out this assessment before starting projects involving sensitive data could result not only in project shutdowns but also in serious administrative fines or reputational harm.

As AI continues to expand into healthcare and public sectors, the outcome of this case could influence how data protection laws are interpreted and enforced, not just in Denmark, but across the European Union.
