very real problems with artificial intelligence today, which may already be exacerbating inequality in the workplace, at home and in our legal and judicial systems. Sexism, racism and other forms of discrimination are being built into the machine-learning algorithms that underlie the technology behind many “intelligent” systems that shape how we are categorized and advertised to.

She included this as part of her anecdotal evidence:
Nikon’s camera software...misread images of Asian people as blinking...

What the hell is this? Does Crawford seriously think Nikon is a technologically self-hating company?
The company is based in Tokyo. Its president is Kazuo Ushida.
No doubt, as artificial intelligence programs are developed, they will have to be tweaked in many ways.

But for Crawford it is all about straight white male superiority--despite the fact that one of her own examples comes from a company based in Asia with an Asian president.
Currently the loudest voices debating the potential dangers of super intelligence are affluent white men, and, perhaps for them, the biggest threat is the rise of an artificially intelligent apex predator.
But for those who already face marginalization or bias, the threats are here.

Yes, of course, artificial intelligence software glitches always go in favor of straight white male superiority and against "marginalized groups," according to Crawford. We would never see, for example, an artificial intelligence virtual assistant glitch that accidentally provides us with a transgender assistant.
In the same edition of the white Gray Lady, Henry Alford reports on his experience using a virtual assistant:
I recently used a virtual assistant named Amy for 10 days....

Hailing from a New York start-up called x.ai, Amy...will set up meetings for you. Once someone has agreed to meet with you at a certain place, you cc Amy, and independently of you she’ll go back and forth with the other person to determine a mutually convenient time, and then help you to put that time on your calendar....

Allowing someone to do your vetting requires trust. I applaud x.ai for including, at the bottom of each of Amy’s emails, the information that Amy Ingram is a form of artificial intelligence...

The strangest moment I’ve had with Amy, though, came when I had her set up an appointment for a phone interview with an x.ai employee who also uses the company’s virtual assistant — in this case, Andrew — for making appointments.

After Amy and Andrew had set up the appointment, I asked Amy why I didn’t see the appointment on my calendar; strangely, she wrote back as Andrew. I thought, not only is my assistant invisible, unpredictable, occasionally moody, and incorrigible — she is also trans.

Bottom line: It is really a stretch to think there are built-in artificial intelligence glitches that go in only one direction. The real pre-programming problem is in the Kate Crawford types, who see sexism and racism, and only sexism and racism, at all times and in the most bizarre places.