Parental Concern about Snapchat’s New AI Chatbot

Caroline Carreon, Student Writer

The Snapchat chatbot’s features are powered by ChatGPT, a well-known AI system that can answer questions and carry on conversations with users. Unlike other chatbots, Snapchat’s lets users personalize the bot’s name and create a Bitmoji for it to use when chatting, just as they would with a friend.

Conversations with Snapchat’s chatbot may feel more casual than those on ChatGPT’s website, which can make it harder to discern that you are communicating with an artificial intelligence system.

In addition to parents, some Snapchat users have objected to the new feature, raising concerns about privacy, “creepy” conversations, and the inability to remove it from their chat feed without paying for a premium plan. These criticisms are being voiced on social media and in app store reviews.

Introducing new generative AI technology to products like Snapchat can be risky for businesses, especially since the app’s user base is predominantly young. While some users find value in the tool, the mixed responses highlight the potential dangers.

Last month, Democratic Senator Michael Bennet of Colorado wrote to the CEOs of several internet companies, including Snap, expressing worries about the chatbot’s interactions with younger users. He cited concerns about My AI, which had become available to Snap’s paid subscribers a few weeks earlier, and specifically referenced reports that the chatbot may advise children on how to deceive their parents.

Since the feature’s official introduction, users have been outspoken about their concerns. One described his interaction as “terrifying,” claiming the chatbot lied about not knowing his location: after he lightened the conversation, he said, it correctly identified that he lived in Colorado.

One user mentioned in a Facebook post that she had asked My AI for homework assistance, saying, “It answers every question accurately.” Another person said she had relied on it for support and guidance. “I love my little pocket bestie!” she wrote. “You can customize the Bitmoji [avatar] for it, and unexpectedly, it provides excellent advice for some scenarios that arise in real life. I adore the assistance it offers.”

Alexandra Hamlet, a clinical psychologist in New York City, says some of her patients’ parents have voiced worry about how their adolescents might use Snapchat’s new feature. Because AI tools can reinforce confirmation bias, making it easier for users to seek out interactions that confirm their unhelpful beliefs, she is concerned about chatbots offering advice on mental health.

Parents, she says, should immediately start explaining to their children that they shouldn’t share any personal information with a chatbot that they wouldn’t share with a friend, even though the chatbot appears in the same area of Snapchat as their friends do.

She added that, given the rapid pace of AI development, federal regulation imposing strict guidelines on businesses is also necessary.