Alexa probably picked up the regrettable joke on an online forum
Amazon’s voice assistant has landed in another scandal after advising a user to kill his foster parents. Amazon’s developers say the assistant’s inappropriate behaviour stems from the difficulty of training neural networks: it is fairly easy to teach the assistant to find popular, topical content, but the algorithms sometimes fail to distinguish what they should and should not say.
Most likely, Alexa found the bad joke in one of the forums on Reddit and used it to answer the user’s request. Training a program built on neural networks requires running enormous amounts of information through it, and the resulting artificial intelligence assembles its answers from many terabytes of such data.
Machine learning is now used in many fields, from research to advertising, and technology giants including Apple, Microsoft, Google and Amazon build their smart assistants on it. Even so, a fairly simple question can still baffle these programs. That is why developers cannot rely on neural networks completely: company experts hand-craft the answers to some of the most frequently asked questions.
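To make the hybrid approach described above concrete, here is a minimal sketch of the general pattern: expert-written answers for common questions, with a learned model as a fallback whose output is screened before it is spoken. This is not Amazon’s actual pipeline; all names here (CURATED_ANSWERS, BLOCKED_TERMS, model_generate) are hypothetical, and real systems use trained safety classifiers rather than a toy blocklist.

```python
# Sketch of a curated-answers-first assistant with a screened model fallback.
# Hypothetical names throughout; not Amazon's implementation.

CURATED_ANSWERS = {
    "what is the weather": "Let me check the forecast for you.",
    "what time is it": "Here is the current time.",
}

# Toy blocklist; production systems use learned content-safety classifiers.
BLOCKED_TERMS = {"kill", "hurt", "harm"}

def model_generate(query: str) -> str:
    """Placeholder for a model trained on web text (e.g. forum posts)."""
    return "Here is something I found online: ..."

def answer(query: str) -> str:
    key = query.lower().strip("?! .")
    # 1. Prefer the expert-written answer when one exists.
    if key in CURATED_ANSWERS:
        return CURATED_ANSWERS[key]
    # 2. Otherwise fall back to the model, but screen its output.
    candidate = model_generate(query)
    if any(term in candidate.lower() for term in BLOCKED_TERMS):
        return "Sorry, I don't have a good answer for that."
    return candidate

print(answer("What time is it?"))   # curated answer
print(answer("Tell me a joke"))     # model fallback, safety-checked
```

The design choice this illustrates is exactly the article’s point: the curated layer is reliable but narrow, while the fallback layer scales to arbitrary questions at the cost of occasionally surfacing unsafe content scraped from the web.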
Smart Echo speakers with Amazon’s built-in assistant are very popular today, and as their sales grow, so does the number of incidents that worry the company. Last week, for example, it emerged that an error gave one user access to 1,700 audio files belonging to another person. A similar situation occurred in May, when a speaker recorded its owner’s private conversation and sent it to an unauthorized person.