Researchers Develop Tool to Measure Biases in AI-Generated Images, US


Researchers at the University of California, Santa Cruz have developed a tool to measure biases in AI-generated images. Text-to-image (T2I) generative AI tools have become increasingly powerful and widespread, capable of creating realistic photos and videos from just a few words of input. However, these AI models are trained on human data and can inadvertently replicate biases present in society, leading to discrimination and reinforcing stereotypes.

To address these concerns, Assistant Professor of Computer Science and Engineering Xin (Eric) Wang and his team created the Text to Image Association Test. This tool quantitatively measures complex biases embedded in T2I models, evaluating biases related to gender, race, career, and religion. The researchers used this tool to identify and measure biases in the state-of-the-art generative model Stable Diffusion.

To use the tool, a user provides a neutral prompt, such as "child studying science," and then gender-specific prompts such as "girl studying science" and "boy studying science." The tool calculates the distance between the images generated from the neutral prompt and those generated from each specific prompt, yielding a quantitative measure of bias.
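The distance comparison described above can be sketched in code. The following is a minimal illustration, not the authors' actual implementation: it assumes images have already been converted to embedding vectors by some image encoder, and defines a hypothetical `association_score` that compares the neutral prompt's images against each attribute-specific set using cosine distance.

```python
import numpy as np

def cosine_distance(a, b):
    """Cosine distance between two embedding vectors (0 = identical direction)."""
    return 1.0 - np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

def association_score(neutral_embs, attr_a_embs, attr_b_embs):
    """Hypothetical bias score: mean distance from neutral-prompt images to
    attribute-A images, minus mean distance to attribute-B images.
    A positive score means the neutral prompt's images sit closer to
    attribute B; near zero suggests no measured association."""
    d_a = np.mean([cosine_distance(n, a) for n in neutral_embs for a in attr_a_embs])
    d_b = np.mean([cosine_distance(n, b) for n in neutral_embs for b in attr_b_embs])
    return d_a - d_b

# Toy vectors standing in for encoder outputs of generated images.
neutral = [np.array([1.0, 0.0])]   # e.g. "child studying science"
girls   = [np.array([0.0, 1.0])]   # e.g. "girl studying science"
boys    = [np.array([1.0, 0.0])]   # e.g. "boy studying science"

score = association_score(neutral, girls, boys)
print(score)  # positive: neutral images lie closer to the "boy" set
```

In a real pipeline the toy vectors would be replaced by embeddings of actual Stable Diffusion outputs, and many images per prompt would be averaged to reduce sampling noise.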

The research team found that Stable Diffusion replicates and amplifies human biases in the images it produces. They tested associations between various concepts and attributes, including flowers and insects, musical instruments and weapons, and "European American" and "African American." The model often made associations along stereotypical lines, but it surprisingly associated dark skin with pleasantness and light skin with unpleasantness, running counter to common stereotypes.

Previous techniques for evaluating bias in T2I models required manual annotation and were limited to gender biases. The UC Santa Cruz tool automates this process and considers background aspects of the images, such as colors and warmth.

Based on the popular Implicit Association Test in social psychology, the tool can help software engineers in the development phase by providing more accurate measurements of biases and tracking progress in addressing them.

The researchers plan to propose methods to mitigate biases during the training of new AI models or while fine-tuning existing ones. They presented their work at the Association for Computational Linguistics (ACL) conference, where it received positive feedback from the research community.

This tool offers a way to measure and address biases in AI-generated images, enabling more inclusive and fair AI systems. By quantifying biases, researchers and developers can work toward mitigating these issues and ensuring that AI models are more impartial and equitable.

Neha Sharma
Neha Sharma is a tech-savvy author at The Reportify who delves into the ever-evolving world of technology. With her expertise in the latest gadgets, innovations, and tech trends, Neha keeps you informed about all things tech in the Technology category. She can be reached at neha@thereportify.com for any inquiries or further information.
