SafeRent is a machine-learning black box for landlords: it assigns prospective tenants a numerical score and a yes/no recommendation on whether to rent to them.
In May 2022, Massachusetts housing voucher recipients and the Community Action Agency of Somerville sued the company, claiming SafeRent gave Black and Hispanic rental applicants with housing vouchers disproportionately lower scores.
The tenants had no visibility into how the algorithm scored them, and appeals were rejected on the grounds that the score was simply what the computer said.
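For anyone who hasn't poked at one of these screeners: here's a minimal toy sketch of the score-plus-threshold pattern described above. To be clear, none of this is SafeRent's actual code; every function name, feature, weight, and cutoff is made up, and the negative "voucher" weight is only there to illustrate the kind of disparate impact the lawsuit alleges.

```python
# Toy sketch (NOT SafeRent's actual code or weights): a generic
# score-plus-threshold tenant screener. Every feature name, weight,
# and the cutoff below is hypothetical.

def score_applicant(features: dict[str, float]) -> float:
    """Stand-in for a proprietary model. The weights are hidden from
    applicants, so the resulting score can't be meaningfully appealed."""
    hidden_weights = {
        "credit": 0.9,
        "eviction_history": -1.2,
        "voucher": -0.5,  # a penalty like this is what the lawsuit alleges
    }
    return sum(hidden_weights.get(name, 0.0) * value
               for name, value in features.items())

def recommend(features: dict[str, float], cutoff: float = 0.5) -> str:
    # The landlord sees only this binary answer, never the reasons.
    return "rent" if score_applicant(features) >= cutoff else "deny"

# An applicant with good credit and no eviction history is denied
# purely because of the hidden voucher penalty.
print(recommend({"credit": 1.0, "eviction_history": 0.0, "voucher": 1.0}))
```

The point is structural: once the weights are proprietary, "deny" is the entire explanation an applicant ever gets.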
The landlords who used the service should also be held liable. You mean to tell me you get a report with a binary answer and you just trust it with no due diligence? If there's no penalty for blindly trusting an algorithm, they'll just move on to the next tool they can use to be bigots.
If any suicides are linked to wronged applicants, the company should be charged with at least "involuntary" manslaughter.
How do you criminally charge an organization? Like, who's responsible? The CEO? The stockholders? The Board of Directors?
Here's an explanation from the Associated Press: the penalty is usually a fine, which hits stockholders by making the stock less valuable and can push them to remove board members or demand the termination of executives. There is also a "corporate death penalty" (revoking a company's charter), though it's rarely used.
The fact that I'd never heard of the corporate death penalty until now, while they're bringing back the actual death penalty, says everything.
“I’ll believe corporations are people when Texas executes one.”
Now they’re promising to only be pretty racist.
OK, some people got paid… but the problem didn't get solved.
Classic America.
There was a lecture by Cory Doctorow about it.