Unless Designers and Users Intervene, Expect More Missteps in an AI World
A recent report by the United Nations Educational, Scientific and Cultural Organization (UNESCO) paints a troubling future for artificial intelligence in terms of promoting dominant gender norms. UNESCO’s report maintains digital assistants like Alexa and Siri create a model of “docile and eager-to-please helpers” that reinforce “commonly held gender biases that women are subservient and tolerant of poor treatment.” Maxwell School Professor Jamie Winders, director of the Autonomous Systems Policy Institute (ASPI), says despite the report, we can expect more of the same unless the public demands changes.
“One of the things that decades of research in the social sciences and humanities has shown us is that there is a two-way street between identity categories like gender and race and the material things that fill our daily lives,” says Winders. “Everything from the furniture in our homes and paint color on our walls to the kinds of wallets and shoes we wear is gendered. In the 21st century, those things that fill our daily lives now include virtual assistants like Alexa or Siri. These assistants are meant to behave like human assistants, so it’s not surprising that they mimic dominant gender norms. By definition, these assistants are meant to be helpful, and around the world, ‘helpful,’ assisting roles are strongly associated with women.”
Winders says that in the case of these virtual assistants, the gendered assumption is built into their design. “Although you can opt for a ‘male’ voice for most virtual assistants, the default remains a female voice, in the same way that the ‘default’ assumptions about who makes the best personal assistant point to women,” says Winders.
Syracuse University’s ASPI was created in part to research this very issue. “As we see more and more AI-driven products attempting to mimic human behavior, those products will continue to reflect and reinforce dominant gender norms, unless we, as designers and users, intentionally intervene,” says Winders. “Without a strong focus on representation (whose voices, images, accents, etc. are being used and not used) at ‘every’ step in the life cycle of these ‘smart’ technologies, from design to regulation to daily use, we will continue to see these missteps.”