On Wednesday, Congress held a hearing on diversity in the technology industry to address the sector’s abysmal statistics on inclusion. Lawmakers and expert witnesses touted increasing diversity as a necessary tool for reducing the potentially harmful biases being coded into technology.
“The industry’s workforce has remained mostly homogenous. People of color, women, and older Americans have all been notably absent from the tech workforce,” said Rep. Jan Schakowsky (D-Ill.), who chairs the House Energy and Commerce Committee’s consumer protection subcommittee, during her opening remarks. “The technology itself reflects that lack of diversity. That has real impact against Americans. We’ve seen algorithms’ bias in sentencing guidelines resulting in harsher sentences for minorities,” she continued, expressing a sentiment shared by many lawmakers during the hearing—fix diversity and you can fix many of the structural ills of technology that have been baked into its algorithms.
While ensuring more diverse representation in the workforce is an important step toward making technology more equitable, research and track records elsewhere suggest it is not a panacea for rooting out inequities, especially at large organizations in high-impact, complex fields.
More diverse hiring practices have increased minority participation in police forces, but research suggests that diverse departments do little to reduce law enforcement’s disparate racial impact: studies have shown they fail to reduce the disproportionate use of lethal force in black communities and are no better at addressing those communities’ needs.
Policing isn’t the only area where better representation of a group has failed to alleviate mistreatment of that group. A more diverse teaching workforce has not closed racial gaps in educational outcomes. The prestigious and clubby financial services industry, which has a comparatively better diversity record, is sometimes touted as a model for tech to follow, but its more diverse workforce didn’t stop the industry from targeting black families with subprime mortgages. President Barack Obama’s race didn’t prevent his administration from pursuing housing policies that were detrimental to black wealth. Big tech doesn’t have to look far for similar examples: IBM has one of the oldest workforces among major technology companies, but that hasn’t kept it from being sued for age discrimination after firing thousands of its older employees.
This lack of improvement isn’t the fault of the minorities working within these organizations; it more likely stems from societal and organizational factors that remain impervious to changes in personnel. As Charles E. Menifield and Geiguen Shin of Rutgers University, along with Princeton University’s Logan Strother, wrote in a paper on race and policing, improvements should focus on “fundamental macro-level policy changes, as well as changes to meso-level organizational practices” that are better equipped than hiring practices “to address the root causes of racial disparities.” The tech sector is likely no different.
It is possible that having more minorities at Facebook could have helped the company avoid building tools that let advertisers discriminate by race in housing ads, a problem it now says it has fixed. But the social-media giant faces a host of problems that reflect real-world, systemic racial bias, and it is far less obvious how a more diverse workforce would resolve them.
The company still makes it possible for advertisers to discriminate against groups of users by targeting ads to proxies for race such as zip code, estimated home value, and estimated income, a dynamic the sketch below illustrates. It’s hard to imagine how more diverse engineers would improve predictive policing algorithms when the crime datasets those algorithms are built on are usually biased. Facial recognition software is, for now, far more accurate at identifying white faces, which could leave darker-skinned people more vulnerable to falling under false police suspicion; but even if that weren’t the case, more accurate software wouldn’t address the existing bias in the police forces set to use it. Such limits make representation more of a first step than the solution lawmakers on the Energy and Commerce Committee want it to be.
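To make the proxy point concrete, here is a minimal, purely illustrative sketch. The data is synthetic, the numbers are invented, and the toy logistic-regression model has nothing to do with Facebook’s actual systems; it only shows how an ad-delivery model trained on features correlated with race, such as home value and income, can end up serving an ad to one group far more often than another even though race is never an input.

```python
# Illustrative sketch only: synthetic data, invented numbers, and a toy model.
# It is NOT any real ad system; it just shows how features that act as proxies
# for race (home value, income) can reproduce racial disparities even when
# race is never given to the model.
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Protected attribute (never shown to the model): 0 = group A, 1 = group B.
group = rng.integers(0, 2, size=n)

# Proxy features correlate with group membership because of real-world
# segregation and wealth gaps (assumed, synthetic distributions).
median_home_value = rng.normal(300_000 - 80_000 * group, 40_000, size=n)
household_income = rng.normal(70_000 - 15_000 * group, 10_000, size=n)

X = np.column_stack([median_home_value, household_income])
X = (X - X.mean(axis=0)) / X.std(axis=0)  # standardize features

# Historical "clicked on a premium housing ad" labels, themselves skewed
# toward wealthier households -- the bias the model will learn.
logits = 1.5 * X[:, 0] + 1.0 * X[:, 1]
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

# Plain logistic regression trained by gradient descent; race is not a feature.
w, b = np.zeros(2), 0.0
for _ in range(2_000):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.5 * (X.T @ (p - y) / n)
    b -= 0.5 * (p - y).mean()

# "Deliver the ad" to the top-scoring 30% of users.
scores = X @ w + b
shown = scores > np.quantile(scores, 0.7)

for g, name in [(0, "group A"), (1, "group B")]:
    print(f"{name}: ad shown to {shown[group == g].mean():.1%} of users")
# Typical result: group A is shown the ad far more often than group B,
# even though the model never saw race -- the proxy features carried it in.
```

Nothing in the sketch depends on anyone’s intent; the disparity falls out of the correlations in the data, which is why simply changing who writes the code does not, on its own, remove it.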