Silicon Sonnets

Computer says 'no': Exploring systemic bias in ChatGPT using an audit approach

February 06, 2024
Show Notes

This study investigates ethnic and gender biases in ChatGPT's job applicant evaluations using a correspondence audit approach with 34,560 vacancy-CV combinations. Findings reveal significant ethnic and gender discrimination, particularly in favorable jobs or those requiring language proficiency, and gender-atypical roles. The study calls for policy and development interventions to address systemic bias in language model applications.