Your Data Is Discriminating...Against You

For some, privacy infringement doesn’t just mean annoying ads; it could mean being denied a job or housing. Prachi Gupta investigates big data’s big problem.

Algorithms can discriminate against certain minorities
(Image credit: Susanna Hayward / Getty Images)

In 2016, Carmen Arroyo’s 22-year-old son, Mikhail, regained consciousness from a six-month coma. He had been electrocuted while atop an electrical pole and had fallen nearly 30 feet, leaving him unable to walk, speak, or take care of himself. Arroyo, then 44, filed an application with her landlord requesting permission for her son to move in with her at her apartment in Willimantic, Connecticut. According to court records, the application was quickly denied without explanation, and Mikhail was sent to a rehabilitation facility, where he would remain for more than a year while his mother searched for a reason why.

Arroyo contacted the Connecticut Fair Housing Center (CFHC), a nonprofit that provides free legal services to alleged victims of housing discrimination. In the process of filing a complaint against the landlord, Arroyo and her lawyers discovered that the landlord didn’t know why the application was denied either; the decision hadn’t been made by him but by an algorithm used by CoreLogic, a software company he had enlisted to screen potential tenants. After Arroyo filed her complaint, the landlord allowed Mikhail to move in with his mother. Arroyo’s lawyers kept digging and ultimately determined what caused the rejection: a citation for shoplifting from 2014 (which had since been withdrawn), according to court documents. “He was blacklisted from housing, despite the fact that he is so severely disabled now and is incapable of committing any crime,” says Salmun Kazerounian, a staff attorney from CFHC who represents Arroyo.

What happened to the Arroyo family is just one example of data leading to discrimination. Automated data systems—technology like CoreLogic’s—use collected intel (public data, such as DMV and court records, that may also be packaged with information scraped from the Internet, like social-media activity) to make life-altering decisions, including whether applicants get jobs, the cost of their insurance, or how a community is policed. In theory, these systems are built to eliminate bias present in human decision-making. In reality, they can fuel it.

That is in part because algorithms are built on biased data and often don’t consider other relevant factors. Because low-income people have more contact with government agencies (for benefits like Medicaid), a disproportionate amount of their info feeds these systems. Not only can this data fall into corporate hands, but the government itself uses it to surveil. For example, when UC Berkeley law professor Khiara Bridges interviewed pregnant women applying for prenatal care under Medicaid, she found that they had to reveal their sexual histories and incidents of domestic violence—details that can then be shared with other public agencies. “I talked to pregnant women who came to the clinic just to get prenatal care, and then the next day they would get a call from Child Protective Services,” Bridges says. When people seek support from the state, “that can end up penalizing them later,” adds University of Baltimore law professor Michele E. Gilman. A person applying for a public benefit can be flagged as a risk, which limits future housing or employment opportunities. People who don’t need to apply for public benefits are spared these injustices.

The problem is pervasive, invisible, and cyclical: Biased data is used to justify surveillance, creating an endless feedback loop of discrimination. In 2012, the Chicago Police Department began using predictive analytics, reliant mostly on arrest-record data, to increase surveillance on certain individuals it considered more likely to commit, or be victims of, gun violence. The program was shelved in 2019 after findings showed it was ineffective at reducing homicides. Civil-rights groups said it perpetuated racial bias. “The algorithm is developed from what you give it,” says Brandi Collins-Dexter, senior campaign director for racial-justice-advocacy group Color of Change. “If you give it trash, it’s going to give you trash.” Feed an algorithm biased information and it will enable future bias.

This reality is the crux of the Arroyo case, the first of its kind: Mikhail, who is Latino, is one of the nearly one third of working-age Americans who have a criminal record, a disproportionate number of whom are Black or Latinx. His lawyers are suing CoreLogic, arguing that, under the guise of neutrality and efficiency, its software reinforces discriminatory policies. (The lawsuit is still pending, and CoreLogic has denied any wrongdoing.) If Arroyo wins, it will be a small step forward. But, unless the U.S. adopts stronger data-privacy legislation, these life-altering structures will go largely unchecked, as they are both “powerful and invisible,” according to Jennifer Lee of the ACLU of Washington. Her organization is just one of many pushing for regulations. Until that happens, we will continue to be watched by discriminatory systems unseen—and minority groups will feel those eyes most of all.

Algorithms can use data to make decisions about a specific person, like Arroyo, but some use it to make inferences about entire groups. This can lead to sweeping, discriminatory generalizations: Facebook has come under fire for showing different housing and job ads to different users based on race and gender; in 2018, Amazon discovered that its automated hiring tool picked male candidates over female ones. In these instances, bias was baked into decision-making tools. Even if the data is not specifically about race or gender, data points—like a person’s music interests or zip code—can become “proxies” for those traits, according to Collins-Dexter.

This story originally appeared in the Fall 2020 issue of Marie Claire.

Prachi Gupta

Prachi Gupta is an award-winning journalist based in New York. Her first book, about Rep. Alexandria Ocasio-Cortez, is available via Workman Publishing.