Whether it's navigation apps like Waze, music recommendation services like Pandora or digital assistants such as Siri, chances are you have used artificial intelligence in your everyday life.

“Today, 85 percent of Americans use AI daily,” said Tess Posner, CEO of AI4ALL.

AI has also been touted as the new must-have for business, powering everything from customer service to marketing to IT. For all its uses, however, AI also has a dark side: in some cases, algorithms are biased.

The most glaring examples have made headlines, such as Google's photo recognition tool labeling images of Black people as gorillas, or a recidivism-prediction algorithm used by law enforcement that is biased against people of color.

Other cases are more subtle.

When Beauty.AI ran an online beauty contest judged entirely by algorithms, the majority of the “winners” had light skin.

Search Google Images for “unprofessional hair” and the results you see will be mostly photos of Black women (while a search for “professional hair” returns mostly images of white women).

While many of these findings have been highlighted, the problem is still not being confronted enough across the broader tech community, much less in the university research programs, government agencies and law enforcement bodies that deploy AI.

“The result is bias that, if not addressed, becomes the Achilles’ heel that ultimately kills artificial intelligence,” said Chad Steelberg, CEO of Veritone.

“You can’t have a machine in which the assumptions and biases of the world have been baked in such a way that its decisions can’t be trusted.

From a basic economic perspective, and a trust perspective, if you want AI to become a strong component of the future, you have to solve this problem.”

As artificial intelligence becomes more prevalent in our everyday lives, a small but growing community of entrepreneurs, data scientists and researchers is now tackling the problem of bias in AI. I spoke with a few of them to learn more about the challenges and the possible solutions.

Cathy O’Neil, founder of O’Neil Risk Consulting & Algorithmic Auditing

In the early 2010s, Cathy O’Neil was working as a data scientist in advertising technology, building algorithms that determined what ads users saw as they surfed the web.

The inputs to those calculations included innocuous-seeming information, such as the search terms a user typed or what kind of computer they owned.

Before long, however, O’Neil realized she was effectively building demographic profiles of users. Although gender and race were never explicit inputs, her algorithms were discriminating against users of certain backgrounds, based on those other inputs acting as proxies.

When O’Neil started talking to colleagues in other industries, she found this was standard practice.

These biased algorithms weren’t just deciding which ads users saw; they were making far more consequential decisions, such as whether someone would be approved for a credit card (an observation O’Neil would go on to examine in depth).

What’s more, in some industries, if a human were making decisions based on those same characteristics, it would likely be illegal under anti-discrimination laws.

But because an algorithm was making the decisions, and gender and race were not explicit factors, the decisions were assumed to be unbiased.
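The proxy effect described here is easy to demonstrate. Below is a minimal, hypothetical sketch (all names and numbers are invented for illustration, not taken from O'Neil's actual models): a scoring rule that never sees the protected attribute still produces very different approval rates across groups, because an allowed input correlates with group membership.

```python
import random

random.seed(0)

# Assumption for illustration: "neighborhood" correlates strongly with
# group membership, while income does not.
def make_applicant(group):
    neighborhood = 0 if random.random() < (0.8 if group == "A" else 0.2) else 1
    income = random.gauss(50, 10)  # same income distribution for both groups
    return {"group": group, "neighborhood": neighborhood, "income": income}

def approve(applicant):
    # The rule uses only neighborhood and income; the protected "group"
    # attribute is never consulted.
    return applicant["neighborhood"] == 0 and applicant["income"] > 45

applicants = [make_applicant(g) for g in ["A"] * 5000 + ["B"] * 5000]

for g in ("A", "B"):
    members = [a for a in applicants if a["group"] == g]
    rate = sum(approve(a) for a in members) / len(members)
    print(f"group {g}: approval rate {rate:.2f}")
```

Even though the rule is "blind" to group membership, group A is approved several times more often than group B, purely because the neighborhood input carries the group information indirectly.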

“I left [the world of] finance because I wanted to do something better than take advantage of the system just because I could,” O’Neil said.

“I went into data science naively. I knew this was how the rich finance guys had been doing things, but in 2012 people still thought all of it was good, that data scientists make the world better.”

O’Neil walked away from her job. She wrote a book, Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, about the dangers of letting algorithms loose on the world, and began consulting.

Eventually, she settled on a specialty: auditing algorithms.

“I must confess that it wasn’t until 2014 or 2015 that I saw this as a business opportunity,” said O’Neil.

In fact, it wasn’t until after the 2016 election that she founded O’Neil Risk Consulting & Algorithmic Auditing (ORCAA).

“I started it because I thought that even people who wanted to avoid unfair practices wouldn’t know how to do it,” said O’Neil. “I didn’t know myself; I had no good advice to give them. But they wanted to know.”

So, what does it mean to audit an algorithm?

“At the highest level, it means improving our definition of what it means for an algorithm to work,” O’Neil said.

 
