Twitter has announced its first algorithmic bias bounty challenge, which offers cash rewards to individuals who help uncover “potential harms” caused by the company’s saliency algorithm. The new competition builds upon the bias identification approach Twitter detailed back in May. The challenge will take place as part of DEF CON AI Village this year.
Algorithms play an important role in social media platforms, surfacing more of the content users are likely to be interested in and less of what they tend to ignore. These same algorithms can unintentionally introduce bias, however, drawing criticism from those affected by the problem.
Twitter addressed this issue in a blog post back in May, describing some of the algorithmic bias that can surface and the steps the company had taken to mitigate it. In an update on the matter today, Twitter revealed that it is taking things one step further with its new bias bounty challenge, the first of its kind in the industry.
Detailing the reason for this new challenge, Twitter said in a blog post:
We’re inspired by how the research and hacker communities helped the security field establish best practices for identifying and mitigating vulnerabilities in order to protect the public. We want to cultivate a similar community, focused on ML ethics, to help us identify a broader range of issues than we would be able to on our own.
Twitter describes the effort as a proactive step toward finding and addressing unintended harms caused by algorithms. As part of the challenge, Twitter will share its saliency model and code with individuals who want to participate. The company will host a workshop during DEF CON AI Village on August 8, at which point the winners will be announced.
The first-place winner will receive $3,500, while the second-place, ‘most generalized,’ and ‘most innovative’ winners will each get $1,000. The third-place winner gets $500. The challenge kicked off today; interested participants should head over to its page for more info.