As the March 15 deadline for quota 2 applications approaches, a study counselor at Aarhus University is encouraging students to use AI language models to help them reflect on their study choices. The recommendation comes with clear warnings that AI must supplement, not replace, human guidance.
AI as a Tool for Study Reflection
Laura Munk Petersen, a study counselor at Aarhus University, is actively promoting AI use as the quota 2 application deadline draws near. She demonstrated on P4 Østjylland how students can assign specific roles to AI language models, such as asking the tool to act as a study counselor who only asks questions rather than providing direct answers. This approach helps students explore their interests, motivations, and dreams through open reflection questions.
The Backpack Method
The university has published a guide on its Rygsækken ("the backpack") website that explains how to use AI prompts for study choice reflection. The guide provides specific instructions on how to interact with language models effectively. Petersen emphasizes that the real value lies in using AI as a reflection tool, a sounding board for ideas rather than a source of definitive answers.
Students can use the technology to examine their own experiences, interests, and hobbies before committing to an educational path. The process involves feeding the AI model detailed information about personal background and goals. In return, the model generates thoughtful questions that help students discover patterns and preferences they might not have considered independently.
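The role-assignment technique described above can be sketched in code. The function below builds the kind of message list most chat-based LLM APIs accept; the wording of the system prompt and the sample student background are invented illustrations, not taken from the university's guide.

```python
def build_counselor_session(student_background: str) -> list[dict]:
    """Cast the model as a study counselor who only asks open
    reflection questions and never gives direct answers."""
    # The system message assigns the role; its exact wording is an
    # assumption in the spirit of Petersen's demonstration.
    system_prompt = (
        "You are a study counselor. Do not recommend programmes or give "
        "direct answers. Respond only with open reflection questions that "
        "help the student explore their interests, motivations, and dreams."
    )
    # The user message carries the student's own background and goals.
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": student_background},
    ]

# Example session setup with a hypothetical student profile.
messages = build_counselor_session(
    "I enjoy biology and volunteer at an animal shelter, but I am unsure "
    "whether to apply for veterinary medicine or biology."
)
```

Whatever chat model is used, the point is the same: the system message fixes the counselor-who-only-asks role before the student's background is introduced.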
Timing and Accessibility
The recommendation comes at a critical moment. Sunday, March 15 at noon marks the absolute final deadline for quota 2 applications this round. Quota 2 allows applicants to be assessed based on a holistic evaluation that weighs experience, motivation, and other qualifications beyond just grade point averages. In contrast, quota 1 admission relies solely on academic grades.
For the thousands of young people facing this major career decision, accessible tools matter. According to research from Studievalg Danmark, approximately 70 percent of upper secondary students use ug.dk for education information. AI could streamline this search process during high-pressure periods.
Clear Limitations and Warnings
Despite her encouragement, Petersen is emphatic about AI’s shortcomings and the boundaries of its usefulness. She identifies several red flags that students must keep in mind when using these tools.
Technical Accuracy Problems
AI language models perform poorly when it comes to specific factual details about educational programs. Petersen warns that the technology is particularly unreliable regarding admission requirements, deadlines, and general rules. Students should never rely on AI for these concrete details. Instead, they should verify all factual information through official channels and institutional websites.
The quality of AI output depends entirely on the quality of input it receives. Students must be precise and clear when formulating their prompts. Vague or poorly constructed questions will generate equally vague and unhelpful responses.
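To make the point concrete, here is an invented contrast between a vague and a precise prompt. Neither example comes from the university's guide; they simply illustrate that a useful prompt states the role, the relevant background, and the desired form of response.

```python
# Hypothetical example of a vague prompt: no role, no background,
# no indication of what kind of answer is wanted.
vague_prompt = "Tell me about studying."

# Hypothetical example of a precise prompt: assigns the counselor role,
# supplies personal context, and asks for a specific form of output.
precise_prompt = (
    "Act as a study counselor who only asks questions, never gives "
    "answers. My background: I finished upper secondary school with "
    "strong grades in chemistry, I volunteer as a tutor, and I am torn "
    "between medicine and pharmacy. Ask me five open questions that "
    "help me work out which of my motivations matter most."
)
```

The vague prompt leaves the model guessing and invites a generic reply; the precise one constrains both the role and the shape of the response.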
The Human Element Remains Essential
Petersen stresses that AI cannot replace conversations with friends, family, and professional study counselors. The human perspective remains crucial for making informed decisions about education and career paths. The technology serves as a complement to traditional guidance, not a substitute for it.
She describes AI as a good tool that cannot stand alone. The personal, nuanced understanding that comes from discussing options with people who know you well remains irreplaceable. These conversations provide context, challenge assumptions, and offer emotional support that no algorithm can replicate.
National Context and Policy Framework
The recommendation from Aarhus University aligns with broader national trends in Danish education. The approach reflects an evolving relationship between technology and learning institutions.
Government Guidelines on AI Use
In January 2026, Digitaliseringsstyrelsen issued updated guidelines that permit AI as a learning tool in Danish folkeskoler (primary and lower secondary schools), gymnasiums, and higher education institutions. The guidelines, developed with Børne- og Undervisningsministeriet, replace interim rules from 2024. They emphasize a balanced approach that encourages AI for concept explanation, research, brainstorming, and translation during the learning process.
However, the same guidelines ban generative AI tools like ChatGPT, Claude, and Gemini during most exams, effective from February 1, 2026. This prohibition applies to folkeskole final exams, gymnasium exams across STX, HHX, HTX, and HF programs, and most higher education assessments. Exceptions exist for IT and technical subjects where AI use is part of the learning objectives.
Educational Reality and Preparation
Despite the increasing presence of AI in educational settings, only about one in three students currently receives formal AI instruction. This gap makes practical guidance from counselors particularly valuable. Schools must develop local AI policies and provide teacher training to implement the national framework effectively.
The government has also introduced experimental programs. Education Minister Mattias Tesfaye announced that from 2026, high school students at schools that volunteer for the pilot may use generative AI during one hour of preparation for oral English exams. The pilot tests controlled AI integration while maintaining exam integrity through supervision.
Expert Perspectives on AI in Guidance
The conversation extends beyond individual universities to national guidance organizations. Their views provide important context for understanding both opportunities and risks.
Improving Information Access
Mathilde Tronegård, director of Studievalg Danmark, supports AI enhancements for platforms like ug.dk. She believes such improvements could help students search for information more efficiently. This efficiency could free guidance counselors to spend more time on complex, individualized conversations that require human judgment and empathy.
The upcoming epx reform, launching in 2030, will mandate AI integration in both collective and individual guidance. This requirement reflects government recognition that AI tools have become permanent features of the educational landscape. The reform, supported by SF and Dansk Folkeparti, aims to prepare students for a workforce where AI literacy is increasingly essential.
Concerns About Bias and Overreliance
Tronegård also raises important cautions. She warns against personalized AI recommendations that could introduce bias into study choices. Such systems might steer students toward certain programs based on patterns that don’t account for individual circumstances or aspirations. The risk of inaccuracy increases significantly when AI operates without human oversight.
Teaching students proper prompt skills becomes critical in this context. Without understanding how to formulate effective questions and critically evaluate AI responses, students may receive unreliable information at crucial decision points. This concern is particularly relevant given that student use of chatbots for education queries is rising but remains limited in sophistication.
Broader Implications for Danish Education
The integration of AI into study guidance reflects larger shifts happening across Danish educational institutions. These changes bring both promises and challenges that educators continue to navigate.
EU Regulatory Framework
The EU AI Act’s key requirements take effect in August 2026, just months after the current application deadline. The act classifies education AI systems, particularly those used for assessment and admissions, as high risk. These systems must meet stringent requirements for transparency, data protection, human oversight, and non-discrimination.
For Danish schools, this means ensuring that any AI tools used in guidance processes comply with EU standards. The regulation adds accountability to the existing national framework. Schools will need to phase out non-compliant platforms and potentially limit choices to certified alternatives.
Ongoing Debates About Control
Skolelederforeningen has called for 2026 to be the year when educators take control of AI’s influence on children’s learning. This call reflects ongoing tension between embracing helpful technology and preserving essential skills. After 2025 established that AI has become a permanent fixture, educational leaders now focus on structured adoption rather than resistance.
The debate centers on finding the right balance. AI clearly offers efficiency and accessibility benefits, particularly for students facing time-sensitive decisions like application deadlines. However, unchecked adoption risks undermining the deep learning and critical thinking that education aims to develop.
A Personal Take
I think the Aarhus University approach strikes a reasonable balance, but it places significant responsibility on students to use AI wisely. On one hand, giving students reflection tools during stressful deadline periods makes sense, especially when only one in three receives formal AI instruction. The backpack method could democratize access to thoughtful guidance for students without extensive counselor availability. On the other hand, I worry that emphasizing AI use might inadvertently encourage overreliance among young people already comfortable with technology but less experienced with critical evaluation. The warnings about AI’s factual unreliability are crucial, yet in high-pressure moments, students might skip verification steps.
Sources and References
The Danish Dream: Best Universities in Denmark
The Danish Dream: The Student Grants Scheme in Denmark: An Overview
The Danish Dream: Lower Interest in Danish Language Studies is Concerning
The Danish Dream: Best Universities in Denmark for Foreigners
DR: Deadline nærmer sig: Studievejleder opfordrer til at bruge AI til studievalg
Skriv Sikkert: AI og Skole i Danmark
Gymnasieskolen: AI kan aldrig erstatte personlig vejledning
EVA: Elevers brug af AI i gymnasiet