Correcting the bias of face recognition technology – establishing an inclusive recognition model

siteadmin | security

Face recognition technology has well-documented shortcomings, particularly when identifying people of color and women. To build a fairer system, Facebook's AI team announced a fairer algorithm at its F8 conference. The team also found that the photos the industry typically collects to train these models skew heavily toward white men: images of other ethnicities and of women are nowhere near a 1:1 proportion. This imbalance in the training data causes the resulting models to misidentify underrepresented groups, a problem that has also led to serious deviations in Amazon Rekognition.
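To make the data-imbalance point concrete, here is a minimal sketch (not Facebook's actual method) of how one might audit the demographic composition of a face dataset and compute inverse-frequency sample weights to counter the skew. The group labels and file names below are assumptions for illustration only.

```python
from collections import Counter

# Hypothetical demographic labels for a face dataset; in practice these
# would come from the dataset's own metadata, not be invented here.
samples = [
    {"path": "img_0001.jpg", "group": "white_male"},
    {"path": "img_0002.jpg", "group": "white_male"},
    {"path": "img_0003.jpg", "group": "black_female"},
    {"path": "img_0004.jpg", "group": "asian_female"},
    # ... thousands more entries in a real dataset
]

def audit_balance(samples):
    """Report how many images each demographic group contributes."""
    counts = Counter(s["group"] for s in samples)
    total = sum(counts.values())
    for group, n in counts.most_common():
        print(f"{group:>15}: {n:6d} images ({n / total:6.1%})")
    return counts

def inverse_frequency_weights(samples):
    """Weight each sample inversely to its group's frequency, so that
    underrepresented groups contribute as much to the training loss
    as overrepresented ones."""
    counts = Counter(s["group"] for s in samples)
    total = len(samples)
    n_groups = len(counts)
    return [total / (n_groups * counts[s["group"]]) for s in samples]

counts = audit_balance(samples)
weights = inverse_frequency_weights(samples)
```

In a PyTorch pipeline, for example, such weights could be passed to torch.utils.data.WeightedRandomSampler so the model sees a more balanced stream of faces during training; reweighting is only one of several possible mitigations, and collecting more diverse data remains the more fundamental fix.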

Facebook's announcement also suggests that the "discrimination" problem in face recognition can be addressed with technology itself, so that people need not be excluded by systematic bias, or come to fear using the technology because of it.