Untold History of AI: Algorithmic Bias Was Born in the 1980s
A medical school thought a computer program would make the admissions process fairer—but it did just the opposite

Oscar Schwartz:

While Franglen’s main motivation was to make admissions processes more efficient, he also hoped that it would remove inconsistencies in the way the admissions staff carried out their duties. The idea was that by ceding agency to a technical system, all student applicants would be subject to precisely the same evaluation, thus creating a fairer process.

In fact, it proved to be just the opposite.

Franglen completed the algorithm in 1979. That year, applicants were evaluated in parallel by both the computer program and human assessors. Franglen found that his system agreed with the gradings of the selection panel 90 to 95 percent of the time. For the medical school’s administrators, this result proved that an algorithm could replace the human selectors. By 1982, all initial applications to St. George’s were being screened by the program.

Within a few years, some staff members had become concerned by the lack of diversity among successful applicants. They conducted an internal review of Franglen’s program and noticed certain rules in the system that weighted applicants on the basis of seemingly irrelevant factors, like place of birth and name. But Franglen assured the committee that these rules were derived from data collected from previous admission trends and would have only marginal impacts on selections.

In December 1986, two senior lecturers at St. George’s caught wind of this internal review and went to the U.K. Commission for Racial Equality. They had reason to believe, they informed the commissioners, that the computer program was being used to covertly discriminate against women and people of color.

The commission launched an inquiry. It found that candidates were classified by the algorithm as “Caucasian” or “non-Caucasian” on the basis of their names and places of birth. If their names were non-Caucasian, the selection process was weighted against them. In fact, simply having a non-European name could automatically take 15 points off an applicant’s score. The commission also found that female applicants were docked three points, on average. As many as 60 applications each year may have been denied interviews on the basis of this scoring system.