Microsoft Creating Tool to Weed Out AI Bias

Dominique Adams

According to the MIT Technology Review, Microsoft is working on a new tool to automate the process of detecting bias in AI algorithms.

Artificial Intelligence (AI) is only as good as the data that is fed into it. Sadly, humans are failing machine-learning models by feeding them society’s own prejudices, which has resulted in AI bias that discriminates against certain groups.

The goal is to help businesses use AI without running the risk of discriminating against certain groups of people. Rich Caruana, a senior researcher on the bias-detection tool at Microsoft, described it as a “dashboard” that engineers can apply to trained AI models.
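Microsoft has not published the internals of the dashboard, but the general idea of checking a trained model for bias can be illustrated with a small sketch. The example below is illustrative only (the function name, metrics and toy data are not from Microsoft's tool): it compares a model's predictions across a protected attribute and reports the selection rate, accuracy and false-positive rate for each group.

```python
# A minimal, illustrative sketch of the kind of per-group comparison a bias
# "dashboard" might surface for a trained model. The metric choices, function
# name and data here are assumptions, not Microsoft's actual tool.
from collections import defaultdict

def group_report(y_true, y_pred, groups):
    """Summarise selection rate, accuracy and false-positive rate per group."""
    buckets = defaultdict(list)
    for truth, pred, group in zip(y_true, y_pred, groups):
        buckets[group].append((truth, pred))

    report = {}
    for group, pairs in buckets.items():
        n = len(pairs)
        selected = sum(pred for _, pred in pairs)          # positive decisions
        correct = sum(truth == pred for truth, pred in pairs)
        negatives = [(t, p) for t, p in pairs if t == 0]    # true negatives + false positives
        false_pos = sum(p for _, p in negatives)
        report[group] = {
            "selection_rate": selected / n,
            "accuracy": correct / n,
            "false_positive_rate": false_pos / len(negatives) if negatives else float("nan"),
        }
    return report

# Example: binary predictions from a trained model, split by a protected attribute.
y_true = [1, 0, 1, 0, 0, 1, 0, 0]
y_pred = [1, 1, 1, 0, 1, 0, 0, 1]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
for group, metrics in group_report(y_true, y_pred, groups).items():
    print(group, metrics)
```

A large gap between groups on any of these metrics is the sort of signal engineers would then investigate in the training data or the model itself.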

He said of the tool: “Of course, we can’t expect perfection—there’s always going to be some bias undetected or that can’t be eliminated—the goal is to do as well as we can.” Caruana added: “The most important thing companies can do right now is educate their workforce so that they’re aware of the myriad ways in which bias can arise and manifest itself and create tools to make models easier to understand and bias easier to detect.”

The Dangers of AI Bias

As the use of AI in decision making proliferates, so too does the potential damage AI can inflict if bias goes undetected. For example, AI bias could significantly affect outcomes in healthcare and the judicial system, where decisions are often based on AI’s predictive capability. Northpointe, for instance, uses its Compas machine-learning software to predict the likelihood that a defendant will reoffend; however, because of AI bias, the tool was found to judge white offenders more favourably than black offenders.

Similarly, Boston University research found that AI could be gender biased in its semantic connections: men were more likely to be associated with the word “programmer” than women. Meanwhile, a study by MIT’s Media Lab revealed that facial recognition algorithms are 12% more likely to misidentify dark-skinned males than light-skinned males.
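The Boston University finding concerns word embeddings, numerical vectors that place related words close together. Purely as a toy illustration, the sketch below compares how close an occupation word sits to gendered words using cosine similarity; the three vectors are invented for the example, whereas the actual study analysed embeddings trained on large text corpora.

```python
# Toy illustration of gendered semantic association in an embedding space.
# The vectors below are made up for the example; real analyses use embeddings
# learned from large text corpora (e.g. word2vec-style models).
import numpy as np

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

embeddings = {
    "man":        np.array([0.9, 0.1, 0.3]),
    "woman":      np.array([0.1, 0.9, 0.3]),
    "programmer": np.array([0.8, 0.2, 0.4]),
}

sim_man = cosine(embeddings["programmer"], embeddings["man"])
sim_woman = cosine(embeddings["programmer"], embeddings["woman"])
print(f"programmer ~ man:   {sim_man:.2f}")
print(f"programmer ~ woman: {sim_woman:.2f}")
# A large gap between the two similarities is one symptom of gender bias
# absorbed from the training text.
```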

Facebook Already Testing Its Detection Tool

Microsoft is not alone in developing a bias-detection tool. In May, Facebook announced it was testing its Fairness Flow tool, which automatically issues an alert if an algorithm is making an unfair judgement about a person based on their race, gender or age. Mark Zuckerberg said it was a necessary tool, as the company’s employees are becoming increasingly dependent on AI for decision making.
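Facebook has not disclosed how Fairness Flow decides when to raise an alert. Purely as a hypothetical sketch, the snippet below flags a model when any group’s selection rate falls below 80% of the best-served group’s rate, the “four-fifths rule” often used in disparate-impact testing; the function name, threshold and figures are assumptions for illustration.

```python
# Hypothetical alerting logic, not Facebook's actual Fairness Flow internals.
# Flags a model when a group's selection rate drops below 80% of the
# best-served group's rate (the "four-fifths rule" from disparate-impact testing).
def disparity_alert(selection_rates, threshold=0.8):
    """selection_rates: dict mapping group name -> fraction of positive decisions."""
    best = max(selection_rates.values())
    alerts = []
    for group, rate in selection_rates.items():
        ratio = rate / best if best else 1.0
        if ratio < threshold:
            alerts.append(f"ALERT: group '{group}' selected at {ratio:.0%} of the top group's rate")
    return alerts

# Example with invented selection rates split by age band.
rates = {"age_18_34": 0.42, "age_35_54": 0.40, "age_55_plus": 0.28}
for message in disparity_alert(rates):
    print(message)
```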

Bin Yu, a professor at UC Berkeley, said that while automated detection was a step in the right direction, more still needed to be done. Yu recommended that large companies hire outside firms to audit their algorithms to prove they are not biased. In regard to Facebook, Yu asserted: “Someone else has to investigate Facebook’s algorithms—they can’t be a secret to everyone.”

Dominique Adams

Staff Writer, DIGIT
