Flickr’s new tag system spewing out offensive labels

This robot isn’t racist — but it does need to be educated.

That’s what photo-sharing site Flickr said after users complained that a new automated tagging system has given a slew of offensive labels to photos of people and places.

The Yahoo-owned site’s “image-recognition technology,” rolled out May 7, identified a black man as an “ape” and an “animal,” although it also gave the “ape” designation to a blond, white woman in facepaint.

Elsewhere, the metal bars outside Nazi concentration camp Dachau were labeled “jungle gym,” while the railroad tracks leading into Auschwitz were tagged “sport.”

In a statement, Flickr said it was preparing a fix for the inflammatory glitches. It noted that the tagging algorithm learns from its mistakes as users delete faulty tags, all of which are generated by computers rather than people.

“While we are very proud of this advanced image-recognition technology, we’re the first to admit there will be mistakes and we are constantly working to improve the experience,” a Flickr spokesman said.

Flickr had anticipated problems when it announced the new feature earlier this month — albeit not the kind that would offend and outrage users.

“Usually, you can tell why a mistake was made (sometimes a bike looks like a motorcycle), but occasionally, it may be baffling (no, your grandma doesn’t look like a cat!),” Flickr employee Andrew Stadlen wrote.

The Flickr fracas follows revelations earlier this week that the White House turned up in Google Maps searches of the word “house” coupled with the n-word.