October 18, 2019

The Honorable Ben Carson
Secretary
U.S. Department of Housing and Urban Development
451 7th Street, S.W.
Washington, D.C. 20410

RE: FR-6111-P-02

Dear Secretary Carson:


I am writing to express my opposition to the U.S. Department of Housing and Urban Development’s
(HUD) proposal to amend the Fair Housing Act’s (FHA) disparate impact standard, which would erode
federal protections for victims of housing discrimination. Not only does the proposed rule strip away
existing protections for some of the most vulnerable populations in our country, but it also removes
critical safeguards on emerging technologies. In particular, regarding the use of models and algorithms,
the new guidance significantly undermines FHA protections while disregarding the mechanics of
machine learning. As financial institutions, landlords, and other housing providers increasingly rely on
algorithmic decision making, this proposal weakens protections for consumers instead of ensuring that
industries adhere to best practices as they implement new technologies.

Although I share HUD’s desire to increase access to credit for underserved communities, this proposal
essentially makes lenders and landlords using algorithmic models exempt from the disparate impact
standard. This burden-shifting framework establishes lines of defense for accused violators of the FHA
that are fundamentally flawed and demonstrate a lack of understanding of how machine learning
technologies work.

First, under the proposal, a defendant need only demonstrate that the inputs to their model or algorithm are
not a protected characteristic, or a substitute for a protected characteristic, in order to dismiss an
accusation of an FHA violation.1 However, it is well established that algorithmic bias rarely stems from a
single protected characteristic or a substitute but rather arises from incomplete data sets and historic
human biases.2 There are countless examples of this,3 not the least of which was HUD’s own decision to
sue Facebook for housing discrimination earlier this year.4 Despite the fact that Facebook removed
suspect categories from its menu items, recent research indicates that the algorithmic model in the ad
delivery system itself was also responsible for bias in delivering housing advertisements.5 This shows
that HUD should delve deeper into the mechanics of the decision-making tools used by lenders,
landlords, and realtors in order to determine whether these tools cause a disparate impact. HUD’s
position in the Facebook case is plainly inconsistent with the proposed rule.

1 https://www.federalregister.gov/d/2019-17542/p-95
2 https://www.brookings.edu/research/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms/#footnote-27
3 In one example, Amazon’s same-day delivery service was found to discriminate against primarily black communities, even though Amazon did not use race or a proxy for race in its decision-making process; see https://www.bloomberg.com/graphics/2016-amazon-same-day/
4 https://www.hud.gov/press/press_releases_media_advisories/HUD_No_18_085

Second, the proposal removes liability from offenders by shifting responsibility to the third party that
provides the decision-making tool. Should a bank or rental company engage in unfair practices through a
biased algorithm, no party would be held accountable. Allowing lenders, landlords, and realtors to
outsource the liability for housing discrimination ensures they will ignore discriminatory outcomes and
actually incentivizes them to avoid asking their vendors questions. Rather than absolving housing
stakeholders from responsibility, HUD should be encouraging actors to take steps to address biases. In
the Algorithmic Accountability Act, which I recently introduced with Senator Ron Wyden and
Congresswoman Yvette Clarke, we outline methods with which the federal government can mitigate the
impacts of biased and discriminatory algorithms. HUD should require housing providers to adopt these
policies, which include third-party audits of algorithmic systems that study impacts on accuracy,
fairness, bias, discrimination, privacy, and security.

Should the administration enact the proposed rule, millions of Americans could be subject to housing
discrimination without recourse. At a time when the country faces an affordable housing crisis,6 the
proposal not only weakens the federal government’s ability to enforce the Fair Housing Act but could also
perpetuate biased algorithmic decision making across industries. I urge you to reconsider and
rescind this proposal, and I appreciate your consideration of my request.

Sincerely,

Cory A. Booker
United States Senator

5 Ali, M., Sapiezynski, P., Bogen, M., Korolova, A., Mislove, A., and Rieke, A. (2019), Discrimination Through Optimization: How Facebook’s Ad Delivery Can Lead To Skewed Outcomes. arXiv preprint arXiv:1904.02095.
6 https://www.citylab.com/perspective/2019/10/affordable-housing-crisis-cities-rent-zoning-development/599758/?utm_campaign=citylab&utm_term=2019-10-11T14%3A33%3A22&utm_content=edit-promo&utm_medium=social&utm_source=twitter
