Algorithmic reparation

Big Data & Society

https://doi.org/10.1177/20539517211044808

Abstract

Machine learning algorithms pervade contemporary society. They are integral to social institutions, inform processes of governance, and animate the mundane technologies of daily life. Consistently, the outcomes of machine learning reflect, reproduce, and amplify structural inequalities. The field of fair machine learning has emerged in response, developing mathematical techniques that increase fairness based on anti-classification, classification parity, and calibration standards. In practice, these computational correctives invariably fall short, operating from an algorithmic idealism that does not, and cannot, address systemic, Intersectional stratifications. Taking present fair machine learning methods as our point of departure, we suggest instead the notion and practice of algorithmic reparation. Rooted in theories of Intersectionality, reparative algorithms name, unmask, and undo allocative and representational harms as they materialize in sociotechnical form. We propose algorithmic reparation...
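The three fairness standards named above are commonly operationalized as follows: anti-classification removes the protected attribute from a model's inputs, classification parity asks that positive-prediction rates match across groups, and calibration asks that a positive prediction correspond to the same observed outcome rate in every group. The sketch below is a minimal illustration in Python on synthetic data; the variable names and toy setup are assumptions for exposition, not the paper's method.

```python
import numpy as np

# Synthetic data: binary predictions y_hat, observed outcomes y, and a
# binary protected attribute a (group 0 vs. group 1).
rng = np.random.default_rng(0)
a = rng.integers(0, 2, size=1000)       # protected attribute
y = rng.integers(0, 2, size=1000)       # observed outcomes
y_hat = rng.integers(0, 2, size=1000)   # model predictions

# Anti-classification: the protected attribute is simply excluded from
# the features a model would see; here X omits a by construction.
X = rng.normal(size=(1000, 5))

# Classification parity (demographic parity): positive-prediction rates
# should be approximately equal across groups.
parity_gap = abs(y_hat[a == 0].mean() - y_hat[a == 1].mean())

# Calibration: among those predicted positive, observed outcome rates
# should match across groups.
cal_gap = abs(y[(a == 0) & (y_hat == 1)].mean()
              - y[(a == 1) & (y_hat == 1)].mean())

print(f"parity gap: {parity_gap:.3f}, calibration gap: {cal_gap:.3f}")
```

Even when such gaps are driven to zero, the abstract's point stands: these metrics operate on a model's inputs and outputs, not on the stratified social conditions that produced the data.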

FAQs

What are the limitations of traditional fair machine learning techniques?

The research reveals that fairness techniques often fail to address systemic inequalities, as evidenced by Amazon's hiring algorithm, which favored men because it was trained on historical, male-dominated resume data (the tool had been in development since 2014). Studies show that such efforts can exacerbate the very inequalities they aim to rectify.

How do algorithms reflect and amplify social inequalities?

The study finds that algorithms perpetuate existing social hierarchies by relying on biased data derived from socially stratified societies. For example, ML systems used in criminal justice often reflect historical discrimination, yielding skewed outcomes against marginalized groups.

What is algorithmic reparation and how does it differ from fairness?

Algorithmic reparation is a framework prioritizing equity over fairness, aiming to rectify historical injustices through intentional resource allocation. Unlike fairness models, it acknowledges systemic disadvantages and actively seeks to address them within ML systems.
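To make "intentional resource allocation" concrete, the sketch below (a hypothetical Python example, not drawn from the paper) reweights training examples so that a historically disadvantaged group counts more heavily during model fitting; the group labels, the uplift factor, and the reweighting rule are all illustrative assumptions.

```python
import numpy as np

def reparative_weights(groups: np.ndarray, uplift: dict) -> np.ndarray:
    """Per-example training weights that deliberately up-weight groups
    subject to documented historical disadvantage.

    uplift maps a group label to a multiplier > 1; choosing the groups
    and multipliers is an explicit, contestable policy decision, not a
    neutral statistic.
    """
    weights = np.ones(len(groups))
    for g, m in uplift.items():
        weights[groups == g] *= m
    return weights / weights.mean()  # normalize so the mean weight is 1

# Hypothetical usage: group "B" has been historically under-served, so
# its examples count double in the training loss; the weights can be
# passed to any estimator that accepts sample_weight, e.g.
# LogisticRegression().fit(X, y, sample_weight=w) in scikit-learn.
groups = np.array(["A", "A", "B", "B", "A", "B"])
w = reparative_weights(groups, uplift={"B": 2.0})
print(w)
```

How large an uplift is warranted, and for whom, is a normative judgment; on the paper's account it should be answered through Intersectional analysis rather than read off a fairness metric.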

What role does Intersectionality play in machine learning algorithm design?

The research highlights Intersectionality as a critical lens that reveals the systemic power relations affecting marginalized identities, which is essential for developing equitable ML systems. By integrating Intersectional principles, algorithmic design can better account for complex, interwoven identities and social contexts.

What barriers exist in implementing algorithmic reparation in ML systems?

The study identifies social, legal, and institutional challenges, such as resistance to resource reallocation and restrictions on the use of data related to protected attributes. These barriers complicate the collaboration between social scientists and computer scientists that meaningful algorithmic reform requires.
