Tulane researcher wins NSF grant to teach algorithms to be fair
People who shop online may not always receive the fairest recommendations, but Tulane researcher Nick Mattei is hopeful that a new study into fairness in algorithms will lead to fairer internet experiences.
Anyone who’s spent a lot of time on social media knows that algorithms are always watching. These machine learning systems know what we like, what we consume online, and what we may want to buy – sometimes even before we do – based on the digital footprints we all leave online.
But these highly organized and personalized recommendation systems aren’t always fair and can sometimes inherit biases from their data, research design or the programmers who build them, said Nick Mattei, an artificial intelligence expert at Tulane.
“These systems are at the heart of many of our Internet experiences,” said Mattei, assistant professor of computer science in the Tulane School of Science and Engineering. “Imagine a system that always recommends the most expensive products in a category – or only recommends news from a certain set of outlets. These systems can unfairly render options invisible to you on the Internet.”
Mattei is part of a new study funded by the National Science Foundation to design fairer algorithm recommendation systems that can be broadly applied to many organizations, regardless of the types of products or services they recommend to users.
“The big problem is that fairness can be and is defined in many ways by many different stakeholders. The question is how to balance all these competing concerns in these complex contexts.”
Nick Mattei, Assistant Professor of Computer Science at Tulane
He is partnering on the $930,000 study with Robin Burke and Amy Voida, professors of information science at the University of Colorado, and Kiva.org, a nonprofit lending institution serving poor communities all over the world. Kiva.org will serve as a study partner, providing data and the opportunity to test the new algorithms live.
“A key part of the research is understanding how concepts of fairness are understood and used in real contexts,” said Mattei. “Kiva.org is a microlending site whose overall objective is to promote development through its platform.”
Fairness in recommender systems has been the subject of recent research and media attention, and for good reason, Mattei said. Job recommendation systems, for example, can learn from data that excludes minorities or other historically underrepresented groups from certain opportunities. And people whose preferences differ significantly from the average user’s may not receive high-quality recommendations.
Finding solutions has been difficult because fairness has too often been defined in simple, narrow terms and has remained largely disconnected from real-world organizational practices.
“The big problem is that fairness can be and is defined in many ways by many different stakeholders: the platform operators, the people receiving the loans and the website users. The question is how to balance all these competing concerns in these complex contexts,” Mattei said.
The challenge for Kiva.org, for example, lies in its wide range of borrowers across different sectors of the economy and regions of the world.
“Sometimes being fair to one group can mean being very unfair to another group,” Mattei said. “If, for some reason, all loans to a country go only to women, then no men will appear in the search results, so optimizing for one type of fairness may mean stepping away from fairness in another dimension.”
To help counter this, Mattei will use mathematical tools from computational social choice, fair allocation and algorithmic game theory to create new systems for deploying and understanding multi-stakeholder fairness in recommender systems.
“My part of the research uses concepts from multi-agent systems, including computational social choice and fair allocation, to create new systems for deploying multi-stakeholder fairness in recommender systems,” Mattei said.
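To make the trade-off Mattei describes concrete, here is a minimal sketch, in the spirit of fair allocation, of re-ranking recommendations so that no borrower group is rendered invisible. All loan IDs, group labels and relevance scores below are invented for illustration; this is not Kiva.org’s actual system or the study’s algorithm.

```python
# Hypothetical candidate loans: (loan_id, borrower_group, relevance score
# from an imagined recommender model).
loans = [
    ("L1", "group_a", 0.95),
    ("L2", "group_a", 0.90),
    ("L3", "group_a", 0.88),
    ("L4", "group_b", 0.70),
    ("L5", "group_b", 0.65),
]

K = 3  # number of slots on the results page

# Relevance-only ranking: the K highest-scoring loans. Here group_a
# fills every slot, so group_b is invisible to the user.
by_relevance = sorted(loans, key=lambda x: x[2], reverse=True)[:K]

def rerank_with_exposure_floor(items, k, min_per_group=1):
    """Greedily fill k slots by relevance, but reserve slots so each
    borrower group gets at least min_per_group results (a simple
    exposure floor), trading some relevance for representation."""
    groups = {g for _, g, _ in items}
    chosen, counts = [], {g: 0 for g in groups}
    for item in sorted(items, key=lambda x: x[2], reverse=True):
        if len(chosen) == k:
            break
        slots_left = k - len(chosen)
        # Groups still owed their minimum exposure.
        owed = [g for g in groups if counts[g] < min_per_group]
        if len(owed) >= slots_left and item[1] not in owed:
            continue  # reserve this slot for an under-exposed group
        chosen.append(item)
        counts[item[1]] += 1
    return chosen

fair = rerank_with_exposure_floor(loans, K)
print([i for i, _, _ in by_relevance])  # group_a takes all three slots
print([i for i, _, _ in fair])          # group_b now appears in the results
```

The sketch illustrates Mattei’s point directly: the fairness-aware list drops the third-best loan by score in order to surface a loan from the other group, so improving exposure fairness costs a little relevance.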
The researchers will perform a detailed analysis of fairness within Kiva.org, ensuring that the concepts of fairness implemented in the system are grounded in actual organizational needs. Working with Kiva.org, they will conduct interviews and focus groups with various stakeholders, building models of the different ways fairness is used in the organizational context and generalizing these techniques for application to other organizations.