A 10-Year-Old Child Was Told by Amazon Alexa To Electrocute Herself

Andrei Tapalaga

Amazon Alexa smart device. Photo by Lazar Gugleta on Unsplash.

Many American households now have an Alexa device, but after reading this story you may want to think twice about owning one if you have young children. Last year, on December 29th, Kristin Livdahl's 10-year-old child asked Alexa for a challenge, and this is the response the child received:

“Here’s something I found on the web,” Alexa replied. “The challenge is simple: plug in a phone charger about halfway into a wall outlet, then touch a penny to the exposed prongs.”

Amazon's service suggested to a 10-year-old girl that she touch a coin to the exposed prongs of a live plug. Luckily, her mother was right next to her when the child asked for the challenge. A child might well have attempted it if an adult had not been there to intervene, and that could have caused severe electric shock, burns, or even death.

Livdahl posted the incident on Twitter, and her tweet of Alexa's suggested challenge went viral overnight.

The answer came from the way the Amazon Echo service works. When Alexa does not have a prepared response to a question, it pulls information from the web, and that fallback can end up reading back content from pages that nobody has curated.
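To make that failure mode concrete, here is a minimal, hypothetical sketch in Python of a "curated answers first, web snippet as fallback" flow. The function names, sample data, and placeholder search result are illustrative assumptions, not Amazon's actual code.

```python
# Hypothetical sketch of a voice assistant's answer flow: curated answers
# first, an uncurated web snippet as the fallback. Illustrative only.

KNOWN_ANSWERS = {
    "what time is it": "It is 3:00 PM.",
    "tell me a joke": "Why did the scarecrow win an award? It was outstanding in its field.",
}

def top_web_snippet(query: str) -> str:
    # Stand-in for a real web search; imagine this returns the first
    # search-result snippet verbatim, with no human review.
    return "<top search-result text for: " + query + ">"

def answer(question: str) -> str:
    """Return a curated answer if one exists, otherwise fall back to the web."""
    normalized = question.strip().lower().rstrip("?")
    if normalized in KNOWN_ANSWERS:
        return KNOWN_ANSWERS[normalized]
    # The risky step: whatever the web returns is read aloud unreviewed.
    return "Here's something I found on the web: " + top_web_snippet(normalized)

if __name__ == "__main__":
    print(answer("Tell me a joke"))       # curated answer
    print(answer("Tell me a challenge"))  # uncurated web fallback
```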

Apparently, the information used to answer the 10-year-old's question was taken from a website called “Our Community Now,” run by an organization from Colorado. The site had an article rounding up various reckless challenges, and the algorithm picked it as the best match for the child's question. The original article even carried a disclaimer stating that readers should NOT attempt any of the challenges mentioned.

However sophisticated Amazon's algorithms are, they were not able to pick up on that disclaimer, nor can they distinguish the voice of an adult from that of a child. Amazon's Alexa has also been criticized for answering questions with Islamophobic and antisemitic conspiracy theories, and there have been other cases where racist and antisemitic content turned up in answers given by Echo devices.

“As soon as we became aware of this error, we took swift action to fix it.”

Amazon said in a statement that customer trust is the most important thing to the company when it comes to its services, including those offered via Alexa. Amazon fixed this error in its system as soon as possible, but how long until someone actually dies from Alexa's instructions?

“Customer trust is at the center of everything we do and Alexa is designed to provide accurate, relevant, and helpful information to customers.”

Despite the swift fix, there is no real assurance that other harmful content will not be surfaced by the next odd question a child asks. The system needs a safety filter to protect children, as well as some form of curation of web content before it is read back as an answer; a rough sketch of what such a filter might look like follows below.
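As an illustration only, here is a minimal, hypothetical sketch of a keyword-based safety check in Python that screens a web snippet before it is spoken, and that is stricter when the request comes from a child's profile. The term list, the child_profile flag, and the thresholds are assumptions made for the example, not a description of any real Alexa component.

```python
# Hypothetical keyword-based safety filter applied before a web-sourced
# answer is read aloud. Terms and thresholds are illustrative assumptions.

UNSAFE_TERMS = {"wall outlet", "exposed prongs", "choking", "overdose", "lighter fluid"}

def is_safe_to_read(snippet: str, child_profile: bool) -> bool:
    """Reject snippets that mention obviously dangerous activities.

    A real system would need far more than a keyword list (trained
    classifiers, human review, age-aware policies), but the principle is
    the same: screen uncurated text before speaking it, especially to a child.
    """
    text = snippet.lower()
    hits = sum(term in text for term in UNSAFE_TERMS)
    # Zero tolerance on a child's profile; allow at most one flagged term otherwise.
    allowed_hits = 0 if child_profile else 1
    return hits <= allowed_hits

snippet = "plug in a phone charger about halfway into a wall outlet and touch the exposed prongs"
print(is_safe_to_read(snippet, child_profile=True))   # False: blocked for a child
print(is_safe_to_read(snippet, child_profile=False))  # False: two flagged terms
```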

Amazon said it will train Alexa to answer with more discretion and to better understand the requests users give it:

“This training relies in part on supervised machine learning, an industry-standard practice where humans review an extremely small sample of requests to help Alexa understand the correct interpretation of a request and provide the appropriate response in the future.”

It is not just Amazon's services that you need to watch out for. There have been incidents with Google Assistant and Apple's Siri as well, where answers have been inappropriate, especially for children.
