NIX Solutions’ reporting: Google’s AI will no longer distinguish between men and women

According to the platform’s developers, Google’s Cloud Vision tool, which identifies what is shown in photos, will no longer distinguish between men and women. As the company explained, a person’s gender cannot always be determined by appearance.

Cloud Vision, a Google tool designed to identify objects and living beings in photographs, will no longer use the gender labels “man” and “woman”, reports ForPost.

According to NIX Solutions, Google told developers using Cloud Vision in an email that the AI would henceforth use the gender-neutral label “person” to identify people.

Google clarified that a person’s gender cannot be determined from appearance alone. The company also cited its own AI ethics principles, emphasizing that labeling a person’s gender in a photograph can reinforce unfair bias against individuals.

After the change took effect, Business Insider tested the tool and confirmed that the AI now labels everyone in images simply as a “person”.
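
For developers who want to reproduce such a check, below is a minimal sketch using Google’s public Cloud Vision Python client (google-cloud-vision). The function name and file path are placeholders, credentials are assumed to be configured via GOOGLE_APPLICATION_CREDENTIALS, and the actual labels returned depend on the image and API version.

from google.cloud import vision

def print_labels(path):
    # Send a local image to Cloud Vision's label detection endpoint
    # and print each returned label with its confidence score.
    client = vision.ImageAnnotatorClient()
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    response = client.label_detection(image=image)
    for label in response.label_annotations:
        print(f"{label.description}: {label.score:.2f}")

# Placeholder path; after Google's change, images of people are expected
# to come back with the neutral label "Person" rather than "Man" or "Woman".
print_labels("photo.jpg")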

AI bias is widely debated in the technology community, since gaps in AI training can lead to unfair judgments.

Put simply, if the people who “train” an AI hold certain prejudices, those prejudices are passed on to the machine, which then cannot judge impartially.

Frederike Kaltheuner, a Mozilla fellow specializing in AI bias, called Google’s decision to stop labeling gender in photos “very positive”:

Each time you try to automatically classify people by gender or sexual orientation, you have to decide which factors to rely on, and that gives rise to a large number of assumptions. Sorting people into men and women assumes that there are only two genders. Anyone who does not fit either category automatically falls outside the classification. So this is not only about bias: a person’s gender cannot be determined by appearance, and any AI system that tries to do so will inevitably fail.