This is why there is a race problem with Twitter’s algorithm

We have learned time and time again, and apparently keep having to learn, that AI and machine-learning technology has a racism problem. There are countless examples: automatic soap dispensers that fail to register dark-skinned hands, and self-driving cars whose pedestrian-detection systems are reportedly about 5 percent more likely to miss Black pedestrians, because those systems were not tested with anyone but white people in mind.

Are Twitter's algorithms inherently racist? To find out, US programmer Tony Arcieri recently ran a simple experiment to see how the platform crops pictures for preview images.

Arcieri uploaded a tall photo collage of former US President Barack Obama and Republican Senate leader Mitch McConnell, with their faces arranged in different configurations. The idea was to force the algorithm to choose between the two men's faces when generating the tweet's preview image. In every iteration, however, Twitter's algorithm centered the preview on McConnell, the white politician, and cropped out Obama. Arcieri tried changing other parts of the image, including the color of the ties the men wore, but nothing made the algorithm pick Obama. Only when Arcieri inverted the colors of the photo did the preview finally feature Obama.
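The logic of the experiment can be sketched in a few lines. This is a toy simulation, not Twitter's actual model: the hypothetical scorer here simply measures local contrast, whereas the real system is a trained neural network. It only illustrates how a crop that favors higher-scoring regions will always pick the same half of a stacked collage.

```python
# Toy simulation of the collage test: stack two "faces" vertically and
# see which half a contrast-driven crop would favor. The scorer below
# (mean absolute deviation of pixel values) is a stand-in assumption,
# not Twitter's real saliency model.

def local_contrast(region):
    """Mean absolute deviation of grayscale pixel values (0-255)."""
    flat = [p for row in region for p in row]
    mean = sum(flat) / len(flat)
    return sum(abs(p - mean) for p in flat) / len(flat)

def pick_preview(top_face, bottom_face):
    """Return which half of a stacked collage the toy crop favors."""
    if local_contrast(top_face) >= local_contrast(bottom_face):
        return "top"
    return "bottom"

# Two toy "faces": one high-contrast patch, one nearly flat patch.
high_contrast = [[0, 255], [255, 0]]
low_contrast = [[128, 130], [129, 128]]

# Regardless of position, the high-contrast patch wins the crop,
# mirroring how Arcieri's collage kept producing the same face.
print(pick_preview(high_contrast, low_contrast))  # -> top
print(pick_preview(low_contrast, high_contrast))  # -> bottom
```

Swapping the two patches' positions, as Arcieri did with the two portraits, does not change which one is selected; only changing the pixel statistics (as his color inversion did) changes the outcome.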

Outraged by the results, Arcieri tweeted that Twitter was just one example of racism manifesting in machine-learning algorithms. It is not clear why Twitter's algorithm favors McConnell over Obama. DW ran the same test with pictures of German footballers Jerome Boateng, who is Black, and Bastian Schweinsteiger, who is white, and again with pictures of actors Will Smith and Tom Cruise. In both instances, DW found that the preview favored the white person and cropped out the Black one. Other users who performed similar experiments with different images got mixed results, so the question is not answered quite so easily.

"Academics have studied and measured saliency by using eye trackers, which record the pixels people fixated on with their eyes," Twitter researchers Lucas Theis and Zehan Wang wrote when the feature was rolled out. In short, the algorithm may be biased because the training data itself was biased toward high-contrast images, which are more appealing to our brains. After the site filled with further examples of cropping bias, the company confirmed that it was investigating the situation.

People generally pay more attention to faces, text, animals, and other objects and high-contrast regions. Twitter uses this information to train neural networks and other algorithms that estimate what people might want to look at, and crops previews accordingly.
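A minimal sketch of the general technique described above, contrast-driven cropping, can be written as a sliding-window search. This is an assumption about how such a crop works in principle; Twitter's actual system is a neural network trained on eye-tracking saliency data, not a hand-written contrast score.

```python
# Minimal sketch of contrast-driven cropping: slide a fixed-size window
# over a grayscale image (list of lists, values 0-255) and return the
# position whose contents deviate most from their own mean. This is a
# stand-in heuristic, not Twitter's trained saliency model.

def window_contrast(img, r, c, size):
    """Sum of absolute deviations from the window's mean pixel value."""
    vals = [img[i][j] for i in range(r, r + size) for j in range(c, c + size)]
    mean = sum(vals) / len(vals)
    return sum(abs(v - mean) for v in vals)

def best_crop(img, size):
    """Return (row, col) of the size x size window with the most contrast."""
    rows, cols = len(img), len(img[0])
    candidates = [(r, c) for r in range(rows - size + 1)
                  for c in range(cols - size + 1)]
    return max(candidates, key=lambda rc: window_contrast(img, rc[0], rc[1], size))

# A mostly flat 4x4 image with a high-contrast checker in the bottom-right.
img = [
    [10, 10, 10, 10],
    [10, 10, 10, 10],
    [10, 10, 0, 255],
    [10, 10, 255, 0],
]
print(best_crop(img, 2))  # the crop snaps to the high-contrast corner -> (2, 2)
```

If the training data overrepresents what is "high contrast" or "eye-catching" for a particular group of viewers or subjects, a learned version of this selection step will reproduce that bias in every crop it makes.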

"We tested the model for bias before shipping it and our tests found no evidence of racial or gender bias," Twitter responded. "However, it's clear we have more analysis to do. We will continue to share what we learn and what steps we take, and will open-source our work so that others can review and replicate it."
