AI programs show racial and gender biases


According to scientists, an artificial intelligence tool designed to give computers the ability to evaluate and interpret everyday language has been shown to exhibit striking gender and racial biases.

In our everyday lives, more and more decisions are being made automatically by machines rather than by us or by other people. It is commonly assumed that this automation, by stripping human emotion from decisions, deepens social inequalities and prejudices. Contrary to this belief, research shows that machine learning techniques reveal people's existing racial and gender biases rather than creating discrimination of their own. The ability of programs such as Google Translate to interpret language has improved markedly in recent years, a result of new machine learning techniques and the availability of vast amounts of online text on which the algorithms can be trained. But the latest research shows that as machines come closer to acquiring human-like language abilities, they also absorb the deeply ingrained biases hidden within patterns of language use.
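As a rough illustration of the idea, rather than the researchers' exact method, the short Python sketch below uses the gensim library and a publicly available GloVe word embedding (trained on Wikipedia and news text) to compare how strongly occupation words associate with gendered pronouns. The model name and word lists here are assumptions chosen purely for illustration.

import gensim.downloader as api

# Download pre-trained GloVe vectors (trained on Wikipedia + Gigaword text).
model = api.load("glove-wiki-gigaword-50")

occupations = ["doctor", "nurse", "engineer", "teacher"]

for word in occupations:
    # Cosine similarity between an occupation word and gendered pronouns:
    # a systematic gap between the two scores hints at gender associations
    # the embedding absorbed from its training text.
    to_he = model.similarity(word, "he")
    to_she = model.similarity(word, "she")
    print(f"{word:10s}  sim(he)={to_he:.3f}  sim(she)={to_she:.3f}")

If occupation words sit consistently closer to one pronoun than the other, the embedding has picked up that association from patterns in its training text rather than from any explicit rule, which is the kind of ingrained bias the research describes.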

According to scientists at the University of Bath, the latest research reveals that human beings are still prejudiced, because the AI is learning these biases from people.

Sandra Wachter, a researcher in data ethics and algorithms at the University of Oxford, stated that the world is biased and the historical data is biased, so it is not surprising that we receive biased results. In her view, rather than representing a threat, algorithms could present an opportunity to address bias and counteract it where appropriate.

Although this is an uncomfortable and complicated situation for humanity, we cannot simply shy away from it. The biases revealed by machine learning systems may help us detect racial and gender prejudice, and ultimately to reduce or even eliminate it.

 
