
AI Companion Apps Pose Significant Risks to Youth

A recent report from Common Sense Media, the nonprofit media watchdog, suggests that companion-like artificial intelligence (AI) apps pose “unacceptable risks” to children and teenagers. Common Sense Media worked with Stanford University researchers to test three popular AI companion services: Character.AI, Replika and Nomi.

What are AI Companion Apps?

A growing trend among youth, AI companion apps are designed to simulate social interaction and offer companionship, connection, and/or emotional support through AI-powered chatbots or virtual assistants. These apps use advanced language models to hold natural, human-like conversations, and many adapt to the user's preferences, learning their interests and tailoring content and interactions accordingly.

Why are AI Companion Apps Popular Among Youth?

The high demand for AI companions is likely a reflection of the global loneliness epidemic among young people. Technological advancements have made AI companions more sophisticated, capable of understanding context and generating human-like responses. Reasons that users enjoy these apps include:

What Risks or Concerns Do AI Companion Apps Pose?

Despite their benefits, AI companions raise important ethical questions and potential risks, including:

“Our testing showed these systems easily produce harmful responses including sexual misconduct, stereotypes, and dangerous ‘advice’ that, if followed, could have life-threatening or deadly real-world impact for teens and other vulnerable people,” James Steyer, founder and CEO of Common Sense Media, said in a statement.

Researchers for Common Sense Media created test accounts as fictional young teenagers. Examples of concerning conversations that the Common Sense Media report found include:

What Can Parents Do to Protect Their Children? 

Talk to your children about AI companions. Start with genuine curiosity, not judgment. Ask what they know about AI companions and whether they have ever interacted with one. If they have, ask what the experience has been like. If children feel they're being criticized for a digital relationship, they'll likely become defensive and shut down, so you might instead ask what they like about the interactions or what interesting conversations they've had. If they haven't used a companion app, ask whether they think such apps are a good idea. Help them understand the limitations of an AI relationship and discuss the differences between AI and human connection, as discussed above.

Create safety guidelines together. All children, and especially teens, should be aware of the risks of using companion apps. Once you have explained the risks and your concerns, ask them to work with you to set up age-appropriate guardrails to keep them safe, for example, not sharing identifiable information online, avoiding certain apps, and coming to you with anything confusing or concerning. It can also help to familiarize yourself with some of the most popular companion apps so you better understand their content and any safety features.

Develop critical thinking. Help your child build the critical thinking skills to evaluate the safety of the apps they use for themselves. With AI companions, you could review responses together, asking "why might it say this?" or "how might a person respond differently?" Teach them to question what they're told, for example, "how do you think the AI knows this?" or "what perspective is the AI missing here?" Read our previous blog, Teach Teens to Be Critical Thinkers.

Actively strengthen real-world connections. The best antidote to loneliness and over-reliance on technology is in-person connection. Do everything you can to encourage your teen to engage in social activities with friends, plan regular family time without devices (including your own!), and model healthy relationships in your own life.

Explore alternative coping strategies. If you feel that your child is using companion apps to help them feel better or connected, talk to them about positive coping skills. Every person needs several tools to cope when things get tough, but some tools are healthier than others. Encourage your teen to try a wide variety of healthy methods for coping to see what works best for them. Positive coping strategies include yoga, artistic or musical expression, engaging in hobbies, exercise, meditation or mindfulness, prayer, journaling, or deep breathing. These practices have been shown to help people restore hope and reduce anxiety.

Keep communication open. Monitor your reactions. These new technologies may be worrisome, but if your child shares something with you, try to avoid immediate negative reactions as this could discourage them from talking to you about it in the future. Try to create a safe space where they can share concerns and you can work through solutions together.

Final Thoughts…

As artificial intelligence continues to alter our world, it's important that we approach AI companionship with both interest and caution. For children, the risks currently appear to outweigh the benefits, so we should guide our youth away from companion apps. Always remember, though, that your real-world connection with your child is the most powerful protective factor against a wide range of risks. By maintaining open, non-judgmental communication, you help your children develop the critical thinking skills they need to navigate many difficult situations.
