Are Teenagers More Comfortable Confiding in AI? IPB University Experts Warn of the Risks
The growing number of teenagers who prefer to confide in artificial intelligence (AI) rather than their parents, friends, or teachers has become a concern for academics at IPB University.
The chairperson of the Center for Gender and Child Studies (PKGA) at IPB University, Dr Yulina Eva Riany, emphasized that this trend needs to be understood critically because it presents both opportunities and challenges for adolescent development.
“For teenagers, AI is considered neutral and non-judgmental. They feel safer expressing their feelings without fear of being scolded, blamed, or ridiculed,” explained Dr Yulina. She added that the 24-hour availability of AI makes it seem like a ‘natural virtual friend’ for Generations Z and Alpha.
However, despite its positive role as an emotional outlet, this phenomenon also reflects a communication gap between adolescents and their parents and wider social circles.
According to Dr Yulina, if not managed properly, confiding in AI can pose serious risks, ranging from data privacy breaches to psychological harm.
“Teenagers may experience personal data leaks because their interactions are stored on the AI service provider’s servers,” she said.
Additionally, emotional dependence on AI has the potential to hinder teenagers’ social skills. Because AI always provides instant responses, teenagers are less likely to learn how to manage frustration, wait their turn, or negotiate with others.
“Empathy, the ability to read facial expressions, and real-world communication skills can decline if all confiding is redirected to AI,” she added.
She emphasized that the role of parents and schools is very important. Parents need to build two-way communication, listen without judging, and provide digital literacy about the risks of sharing personal data.
“Occasionally ask your child, in a supportive manner, what they talk about with AI. This active guidance is crucial so that teenagers do not take the wrong steps,” said Dr Yulina.
For schools, the integration of digital and emotional literacy into the curriculum is an urgent need. Guidance counselors also need to understand this phenomenon so that teenagers remain comfortable talking to humans.
“Schools can establish a peer support system: a group of peers trained to listen. This way, teenagers do not rely solely on AI,” she explained.
The phenomenon of teenagers confiding in AI, continued Dr Yulina, should be used as an opportunity to strengthen healthy communication within families and school environments. Finally, she suggested that AI platforms should implement strict content moderation, data transparency, and automatic safeguards to respond to harmful keywords.
“AI should be positioned as a companion, not a replacement for psychologists or counselors. AI is merely a tool, not a substitute for human relationships,” she concluded. (dr) (IAAS/KQA)