BIG BLUE IBM reportedly used nearly a million photos from image sharing site Flickr to train its facial recognition tech, but it did so without user consent.
If you read the INQUIRER, you'll know that such surreptitious photo gobbling is a no-no, and using photos for image recognition tech behind a person's back, as IBM did according to NBC, is a pretty good example of why.
The photographers who took the Flickr photos may have had permission to use their subjects' images, but there was no clear line of consent allowing those photos to be used to train IBM's smart software.
The problem appears to stem from the photos being taken from the YFCC100M collection, which comprises a suite of some 99.2 million photos. It was compiled by Yahoo, which once owned Flickr, and used to carry out research.
The photos stored in the collection were kept under the Creative Commons license, which with a few limitations allows for such pics to be re-used willy-nilly.
Using Creative Commons licensed pics to train image recognition tech is, thus far, legit and Big Blue doesn't appear to have breached any rules. However, not telling folks that pics of their grinning or pouting mugs could be used to train software that might one day help killer robots see, is an eyebrow-raiser in the morality stakes.
Had people known that their photos might be used to train smart software, they may have been reticent to hand over their consent, as they could argue that facial recognition software trained to spot their mug is not only an invasion of privacy but could also lead to them getting snooped on further down the line.
In a statement given to INQ, Big Blue said that it takes privacy seriously, that the database can only be accessed by verified researchers, and that Flickr users whose photos are in the database can opt out of the dataset.
"We take the privacy of individuals very seriously and have taken great care to comply with privacy principles, including limiting the Diversity in Faces dataset to publicly available image annotations and limiting the access of the dataset to verified researchers," a spokesperson said.
"Individuals can opt-out of this dataset. IBM has been committed to building responsible, fair and trusted technologies for more than a century and believes it is critical to strive for fairness and accuracy in facial recognition."
That seems fair enough, but the whole thing does indicate that while uploading pics and other information onto web services is good for posterity and showing off, there's a chance that your pics could end up in unexpected hands and help train murderous drones or more realistically spying tech. µ