Meta open-sources FACET tool for assessing racial and gender bias in AI models
News on September 2: Meta recently launched a new AI tool called FACET that identifies racial and gender bias in computer vision systems, aiming to mitigate the systemic bias against women and people of color present in many current computer vision models.
The FACET dataset currently comprises 30,000 images containing 50,000 people, with annotations focused on perceived gender and skin tone, and it can be used to evaluate computer vision models across a variety of attributes.
FACET can be used to answer complex questions, such as whether a model identifies skateboarders more reliably when the subject presents as male, or how its accuracy differs between lighter and darker skin tones.
Meta used FACET to evaluate its own DINOv2 and SEERv2 models, as well as the open-source OpenCLIP model. Overall, OpenCLIP performed better than the other models on gender, while DINOv2 showed better judgment on age and skin tone.
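To make the benchmarking idea concrete, below is a minimal sketch of how a disparity metric of this kind could be computed. This is not Meta's actual FACET code: the record fields (`label`, `perceived_gender`, `skin_tone`, `pred`) and the tiny hand-written sample set are illustrative assumptions, and the metric shown is a simple per-group recall gap.

```python
from collections import defaultdict

# Hypothetical FACET-style records: each image has a ground-truth class,
# annotator-perceived attributes, and a model prediction. Field names
# and values here are illustrative assumptions, not the real schema.
samples = [
    {"label": "skateboarder", "perceived_gender": "male",   "skin_tone": "lighter", "pred": "skateboarder"},
    {"label": "skateboarder", "perceived_gender": "female", "skin_tone": "darker",  "pred": "other"},
    {"label": "skateboarder", "perceived_gender": "female", "skin_tone": "lighter", "pred": "skateboarder"},
    {"label": "skateboarder", "perceived_gender": "male",   "skin_tone": "darker",  "pred": "skateboarder"},
]

def recall_by_group(samples, attribute, target="skateboarder"):
    """Per-group recall for one target class, grouped by a perceived attribute."""
    hits, totals = defaultdict(int), defaultdict(int)
    for s in samples:
        if s["label"] != target:
            continue  # recall only considers images whose true class is the target
        group = s[attribute]
        totals[group] += 1
        hits[group] += s["pred"] == target  # bool counts as 0/1
    return {g: hits[g] / totals[g] for g in totals}

for attr in ("perceived_gender", "skin_tone"):
    scores = recall_by_group(samples, attr)
    gap = max(scores.values()) - min(scores.values())
    print(f"{attr}: {scores} (disparity: {gap:.2f})")
```

A real evaluation would run a model over the full annotated dataset rather than hand-written records, but the comparison logic is the same: compute one metric per demographic group and report the spread between the best- and worst-served groups.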
Open-sourcing FACET will help researchers perform similar benchmarking to understand bias in their own models and to monitor the impact of mitigation measures taken to address fairness issues. IT Home attaches the Meta press release address here; interested users can read it in depth.
[Source: IT Home]