
New training approach aims to reduce social bias in AI

王林 (Original) | 2024-06-27 08:29:45


AI chatbots are known to generate racially prejudiced responses to a wide range of questions, and many research efforts have targeted this problem. A new training method now aims to iron out the issue. Called "fair deduplication," or simply FairDeDup, it is the result of research by Eric Slyman, a doctoral student at the OSU College of Engineering, together with researchers at Adobe.

Deduplicating the datasets used in AI training means removing redundant information, which lowers the cost of the whole process. Because this data is currently scraped from all over the internet, it contains the unfair or biased ideas and behaviors that humans often express and share online.

According to Slyman, "FairDeDup removes redundant data while incorporating controllable, human-defined dimensions of diversity to mitigate biases. Our approach enables AI training that is not only cost-effective and accurate but also more fair." The biases perpetuated by today's AI chatbots involve occupation, race, and gender, but also age, geography, and culture.
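To make the idea concrete, here is a minimal toy sketch of diversity-aware deduplication. It is not the paper's algorithm: the `cluster` field stands in for an embedding-based near-duplicate grouping, and `group` stands in for a human-defined diversity dimension (both names are hypothetical). Within each duplicate cluster, the sketch keeps one example, preferring attribute values that are underrepresented in what has been kept so far.

```python
from collections import Counter

def fair_dedup(items, key, attr, per_key=1):
    """Toy diversity-aware deduplication (illustrative only).

    items: list of dicts. `key` names a field that groups near-duplicates
    (a stand-in for an embedding-cluster id). `attr` names a human-defined
    diversity dimension. Within each duplicate group, keep up to `per_key`
    items, preferring attribute values underrepresented among kept items.
    """
    groups = {}
    for it in items:
        groups.setdefault(it[key], []).append(it)

    kept, seen_attr = [], Counter()
    for group in groups.values():
        # Stable sort: candidates whose attribute value is rarest so far win the slot.
        group.sort(key=lambda it: seen_attr[it[attr]])
        for it in group[:per_key]:
            kept.append(it)
            seen_attr[it[attr]] += 1
    return kept

items = [
    {"cluster": 0, "group": "A", "id": 1},
    {"cluster": 0, "group": "B", "id": 2},
    {"cluster": 1, "group": "A", "id": 3},
    {"cluster": 1, "group": "B", "id": 4},
]
kept = fair_dedup(items, "cluster", "group")
print([it["id"] for it in kept])  # one item per cluster, both groups represented
```

A purely semantic deduplicator would be free to keep only group "A" examples here; steering the tie-break by representation counts is the kind of controllable, human-defined lever the quote describes.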

FairDeDup is an improved version of an earlier method known as SemDeDup, which was cost-effective but often exacerbated social biases. Readers interested in this field can pick up Kris Hermans' Mastering AI Model Training: A Comprehensive Guide To Become An Expert In Training AI Models, currently available on Kindle for $9.99 or in paperback for $44.07.

