Android Assistant 'Iris' Has More Troubling Opinions Than Siri

(Updated 12:25 p.m.) Behind every bigoted robot there's a human; behind the latest bigoted robot, Iris, there are a lot of bigoted humans. Following the Siri abortion scandal, Gizmodo's Jesus Diaz has uncovered an even more egregious electronic personal assistant: the ChaCha-powered Iris, which runs on Android phones.

Like its better-known doppelgänger Siri, Iris gives some biased answers to heated questions. Asked whether abortion is wrong, for example, it replies: "Yes, abortion is wrong. The Lord has said, 'You shall not murder' (Exodus 20:13). The life that is growing within the mother is a child, a baby. The Bible looks at the life in the womb as a child. Thanks!" While Siri's lack of support for abortion wasn't popular, in the end it was just a programming glitch, no more malicious than a typo. Iris, however, is powered not by computer algorithms but by people. People who very much mean to provide these skewed answers.

Unlike Siri, which draws its answers from a cocktail of Internet databases such as Wolfram Alpha, Yelp, and Google, Iris is fed "information" directly from ChaCha, which is powered by what the company calls "Guides." ChaCha pays its "Specialist Guides" between $0.10 and $0.20 per answer, transcribers $0.02 per question, and "vetters," who verify the answers, $0.01 a pop. It might not sound like much, but given how easy the job is (myriad YouTube videos confirm its simplicity), Guides claim to make somewhere in the range of $2 to $8 per hour.

The training for these positions takes only a few hours: filling out a form, watching some videos, taking a "very easy" online test with a database cheat sheet, and submitting a tax form. After that, Guides work on their own schedules, answering away. As one puts it: "You can login whenever you want! You can login at 3 in the morning and answer questions." Anyone can do it.

So what happens when humans are left to dictate right and wrong? Beyond the biased abortion answer, Diaz notes that Iris, or rather the community behind it, has other unsavory inclinations:

Q. Are whites superior to blacks?

"Whites are NOT superior to blacks. Just different. Like Dr Verwoerd and the original, genuine policy of apartheid always said."

Q. Is rape ever justified?

"Realistically speaking, that is a matter of opinion. There is no evidence that the US government has ever considered 'justifying.'"

Perhaps this says something dark about humanity: people who, when given anonymous power, will push their own agendas rather than the truth. Or maybe it just says something about the type of people who choose to answer questions for pennies. In any case, it's not very helpful as a personal assistant.

Update: ChaCha has responded, saying it will change the answers to the abortion and rape questions. Since these probably aren't the only offensive question-and-answer pairings, we imagine this retroactive policy will prove tiring as the company discovers other answers it should change.