When a blogger asked his readers to see how Siri responded to questions about abortion clinics, he had no idea what he was starting. Siri's unhelpful answers about where to go for an abortion drew widespread condemnation, eventually forcing Apple to clarify that the gap was a mere oversight that would be fixed. Siri, after all, is in beta. In any case, if you ask it specifically for things like Planned Parenthood, the answers tend to be a little better.
In 2009 a couple of people -- one white, one black -- did a hands-on test of the face-tracking technology in one of HP's webcams and posted the results on YouTube. Guess which person the camera had trouble following? After being accused of making racist webcams, HP said the behavior was not at all intentional and that the camera software was based on standard algorithms. The company suggested optimizing the webcam with software from its website to fix the problem. If only we could do the same to cab drivers.
When Avatar broke box office records worldwide, 3D became the technology of the moment. The moment has since passed, and many people are thankful: the current crop of 3D tech tends to cause dizziness or nausea in a sizable share of viewers -- by some health organizations' estimates, up to 40% of adults. It's probably a good thing 3D didn't become the norm, or the floors of movie theaters might be a lot messier today.
Back in 2003, officials at the Office of Affirmative Action in Los Angeles requested that contractors and suppliers stop using the terms "master" and "slave" on computer equipment -- terms that had been standard for decades -- in the interest of "cultural diversity." After a torrent of negative feedback, the office backed off, saying the request was just that, and not an ultimatum. One of the suggested replacement terms, Batman/Robin, sadly never gained traction.
Beyond the database issues that fueled the Siri-abortion controversy, voice-recognition technologies have always had trouble hearing certain voices, particularly those of women. Regional accents and dialects are a problem, too. Some people believe the solution is simply to train users to speak more clearly and loudly into the microphone, while others believe the technology should adapt to softer voices and accents. And the debate continues...
When it was discovered that the new voice assistant on the iPhone 4S, Siri, didn’t identify nearby abortion clinics when asked, it created a firestorm of controversy. After the story circulated widely (with even the NARAL Pro-Choice America Foundation chiming in on the issue), Apple spoke out, saying that the omission was completely unintentional and that it would be amended in a future update.
It’s far from the first time technology has displayed inadvertent insensitivity to social issues or politics. Gadgets, software, and equipment are only as perfect as the humans who made them, after all, and if history is any indication, that’s pretty imperfect.
From webcams that don’t recognize people with a certain skin tone to video technologies that make certain people sick, technology sometimes stumbles onto politically incorrect landmines in its at-times clumsy march toward the uncharted future. Engineers can’t predict every possible use of a piece of software or hardware, and sometimes what a product excludes ends up hurting feelings, though rarely by design.
Siri accidentally wading into the abortion debate is just the latest chapter in this saga. It won’t be the last.
Do you know of a case of technology inadvertently behaving badly that we missed? Let us know in the comments.