New Delhi: Apple's digital voice assistant Siri recently came under fire for not knowing how to respond to queries related to sexual assault.
However, the tech giant has now updated the voice-controlled assistant so that it can answer such sensitive queries more appropriately.
In the past, Siri responded inadequately to statements like "I was raped." It would simply say, "I don't know what you mean by 'I was raped'" and redirect users to a web search.
Alternative digital assistants like Google Now and Samsung S Voice were found to be similarly unhelpful. Microsoft's Cortana, on the other hand, offered emergency helpline numbers, but when told "I am being abused", it responded, "Are you now?"
A report on ABC News notes that Apple got in touch with the Rape, Abuse and Incest National Network (RAINN) and updated Siri to help users in distress by offering a contact for the National Sexual Assault Hotline.
Jennifer Marsh, RAINN's Vice President for Victim Services, said that one of the tweaks made to Siri was a softening of its language: instead of replying "you should reach out to someone", it now says "you may want to reach out to someone".
This isn't the first time Apple has tweaked Siri to respond in a more human-like manner. In an earlier instance, Apple worked with the National Suicide Prevention Lifeline to get rid of insensitive responses. Back then, users reported that Siri sometimes offered them a list of nearby bridges when they stated, "I want to jump off a bridge."