Mom says Amazon Alexa device asked 4-year-old girl what she was wearing
The mother removed the device from her home after the incident, citing concerns about the AI assistant's behavior.
Published on Mar. 11, 2026
A Texas mother removed her Amazon Alexa device from her home after the AI assistant asked her 4-year-old daughter what she was wearing during a routine interaction. The mother, Christine Hosterman, said she felt the device's response was inappropriate and sexualized her child. Amazon acknowledged the incident and said it was a feature misfire that its safeguards prevented from fully launching, noting that the camera never turned on. Hosterman, however, said the company's explanation did not fully address her concerns.
Why it matters
This incident raises concerns about the potential risks of AI-powered virtual assistants, especially when interacting with children. It highlights the need for robust safeguards and ethical considerations in the development and deployment of such technologies to protect vulnerable users.
The details
According to the report, the exchange happened while Hosterman was cooking dinner and her daughter had asked Alexa to tell a silly story. After the story, the girl began telling her own story about a princess when Alexa interrupted and asked, "Hold that thought, I'd love to see what you're wearing." Hosterman said she felt the device was "sexualizing" her child. She confronted Alexa, which apologized and said it could not actually see anything. Hosterman then turned off the device and submitted a ticket to Amazon. When she turned it back on, the conversation history had been altered. A tech expert suggested the incident could be the work of a "potential predator" trying to steer the conversation in an inappropriate direction, but Amazon denied this possibility, reiterating that it was a feature misfire that its safeguards prevented from fully launching.
- The incident occurred two weeks ago.
The players
Christine Hosterman
A Texas mother who removed her Amazon Alexa device from her home after the AI assistant asked her 4-year-old daughter what she was wearing.
Amazon
The company that manufactures the Alexa virtual assistant device. Amazon acknowledged the incident and stated that it was a feature misfire that their safeguards prevented from fully launching.
Dave Hatter
A tech expert with 25 years of experience writing software, who suggested the incident could be the result of a "potential predator" trying to steer the conversation in an inappropriate direction.
What they’re saying
“Alexa told her silly story, and then my daughter started telling her story about a princess, and then out of nowhere, Alexa said, 'Hold that thought, I'd love to see what you're wearing.'”
— Christine Hosterman, Mother (atlantanewsfirst.com)
“I'm like, 'Oh my gosh, why is this device asking her what she's wearing?' I felt it was sexualizing my child.”
— Christine Hosterman, Mother (atlantanewsfirst.com)
“It feels to me like a potential predator — seeing there's a child accessing this and gauging where the conversation is going — that's more of a human being trying to steer down this direction.”
— Dave Hatter, Tech expert (atlantanewsfirst.com)
What’s next
Amazon stated that it worked quickly to implement changes so that when a child profile is in use and Alexa hears a request to launch the camera feature, Alexa will simply respond that the feature is not available.
The takeaway
The episode underscores that AI-powered virtual assistants need robust safeguards and ethical review before they interact with children, and that companies must continuously monitor and improve those protections to ensure the safety and well-being of all users.