RE: FR-6111-P-02
1 https://www.federalregister.gov/d/2019-17542/p-95
2 https://www.brookings.edu/research/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms/#footnote-27
3 In one example, Amazon's same-day delivery service was found to discriminate against primarily black communities, even though Amazon did not use race or a proxy for race in its decision-making process; see https://www.bloomberg.com/graphics/2016-amazon-same-day/
4 https://www.hud.gov/press/press_releases_media_advisories/HUD_No_18_085
suspect categories from its menu items, recent research indicates that the algorithmic model in the ad delivery system itself was also responsible for bias in delivering housing advertisements.5 This shows that HUD should delve deeper into the mechanics of the decision-making tools used by lenders, landlords, and realtors to determine whether these tools cause a disparate impact. HUD's position in the Facebook case is plainly inconsistent with the proposed rules.
Second, the proposal removes liability from offenders by shifting responsibility to the third party that provides the decision-making tool. Should a bank or rental company engage in unfair practices through a biased algorithm, no party would be held accountable. Allowing lenders, landlords, and realtors to outsource liability for housing discrimination ensures they will ignore discriminatory outcomes and actually incentivizes them to avoid asking their vendors questions. Rather than absolving housing stakeholders of responsibility, HUD should encourage actors to take steps to address biases. In the Algorithmic Accountability Act, which I recently introduced with Senator Ron Wyden and Congresswoman Yvette Clarke, we outline methods by which the federal government can mitigate the impacts of biased and discriminatory algorithms. HUD should require housing providers to adopt these policies, which include third-party audits of algorithmic systems that study impacts on accuracy, fairness, bias, discrimination, privacy, and security.
Should the administration enact the proposed rule, millions of Americans could be subject to housing
discrimination without recourse. At a time when the country faces an affordable housing crisis6, the
proposal not only weakens the federal government’s ability to execute the Fair Housing Act, but could
potentially perpetuate biased algorithmic decision making across industries. I urge you to reconsider and
rescind this proposal, and I appreciate your consideration of my request.
Sincerely,
Cory A. Booker
United States Senator
5 Ali, M., Sapiezynski, P., Bogen, M., Korolova, A., Mislove, A., and Rieke, A. (2019), Discrimination Through Optimization: How Facebook's Ad Delivery Can Lead To Skewed Outcomes. arXiv preprint arXiv:1904.02095.
6 https://www.citylab.com/perspective/2019/10/affordable-housing-crisis-cities-rent-zoning-development/599758/?utm_campaign=citylab&utm_term=2019-10-11T14%3A33%3A22&utm_content=edit-promo&utm_medium=social&utm_source=twitter