As it turns out, surveillance cameras that have been “trained” to spot and read license plates aren’t all that good at discerning real ones from fakes. That makes it pretty easy to trick Automatic License Plate Reader (ALPR) systems with images of fake plates, and to flood their databases with unusable information.
When hacker and fashion designer Kate Rose learned – through a conversation with Dave Maass, a researcher with the Electronic Frontier Foundation – that the plate readers kind of suck at their jobs, she got an idea. Her new line “Adversarial Fashion” is the result. Unveiled at the DefCon cybersecurity conference in Las Vegas last week, the garments spell out the words of the Fourth Amendment of the US Constitution, which protects Americans from “unreasonable searches and seizures.”
The cameras, however, read the garments as real license plates, and the proof is in their databases. ALPRs are always on, and can collect thousands of plates per minute, so for the system, there’s nothing unusual about capturing so many individual plates at a time. As Rose’s presentation at DefCon noted, overloading this kind of surveillance technology is one of the main methods of confounding it (along with blocking the collection of information).
If you’re interested in making your own, Rose has provided all of the information you need to do it. But her line of Adversarial Fashion is pretty affordable, with prices starting at $24.99 – check it out here, or follow the brand new Instagram account @adversarialfashion.
Backdoor plan: On Monday, the Trump administration will introduce a new rule making it harder for people to bring discrimination complaints under the Fair Housing Act. The proposal from the U.S. Department of Housing and Urban Development would reverse Obama-era rules and raise the burden of proof for parties claiming discrimination through “disparate impact,” a civil rights legal theory that allows challenges to policies that have an adverse effect on minorities without explicitly discriminating against them.
The new HUD rule also carves out unprecedented guidance for the automated decision-making tools that power the housing market. It outlines new defenses for landlords, lenders, and others accused of discrimination, shielding their use of third-party algorithms that measure credit risk, home insurance, mortgage interest rates, and more. But critics say the loophole could essentially build an industry backdoor to bias in housing. CityLab’s Kriston Capps has the story: How HUD Could Dismantle a Pillar of Civil Rights Law
Editor’s note: This section of Thursday’s newsletter included a broken link! You can read Laura Bliss’s “The Ocean Can't Claim the Rockaways Yet” here.
For a lot of folks, a week at the seashore means an escape from the city. Even the rallying cries of French anarchists (“Beneath the paving stones, the beach!”) found the promise of untamed shores useful as a relaxing foil to the urban grind. The beach is where the water meets the city, washing up the first signs of change that are bound to end up on land soon.
To that end, we’ve been basking in the mid-August wonder that is Beach Week. With stories about a lack of affordable housing on a tony island, an oceanfront neighborhood weathering the waves of climate change and gentrification, a beach bum innovation in bikes, and how to survive a shark attack in your mayoral election, we’re sure to have a few beach reads for you. See the CityLab Beach Week series here.
Is a recession coming? Here’s what that means for housing (Curbed)
Have developers found a gentler way to gentrify? (New York Times)
Mexico City’s rain-harvesting could change how cities manage water (Next City)
Higher prices threaten Silicon Valley’s micromobility revolution (Time)
Why Route 66 became America’s most famous road (Vox)
Bee Breeders have announced the winners of the SKYHIVE Skyscraper Challenge. The purpose of the competition was to allow architects, design students, engineers, and artists from all over the world to generate design ideas for iconic high-rise buildings in cities around the globe. As part of this design series, participants were encouraged to incorporate new technologies, materials, forms, spatial organizations, and construction systems in their designs for a skyscraper.
The Trump administration will introduce a new rule on Monday that may reshape the way the government enforces fair housing law, making it harder for people to bring forward discrimination complaints under the Fair Housing Act.
The proposed regulation from the U.S. Department of Housing and Urban Development would replace an Obama-era rule on disparate impact, a legal theory that has guided fair housing law for more than 50 years. Disparate impact refers to practices or policies that have an adverse impact on minorities without discriminating against them in explicit terms. The Supreme Court has recognized this form of bias as prohibited under the Fair Housing Act. But the new rule from HUD would substantially raise the burden of proof for parties claiming discrimination.
The new regulation also goes further: The HUD rule carves out unprecedented guidance for the automated decision-making systems that power the housing market. These are the algorithms used by lenders and landlords that deliver judgments on credit risk, home insurance, mortgage interest rates, and more. Under the new dispensation, lenders would not be responsible for the effects of an algorithm provided by a third party—a standard that critics say would build an industry backdoor to bias.
“This is a proposal to very dramatically revise and effectively destroy an existing 2013 civil rights regulation,” says Megan Haberle, deputy director for the Poverty & Race Research Action Council. “This is a core part of the Fair Housing Act, and very early fair housing cases across the country have recognized the discriminatory effects standard.”
Housing Secretary Ben Carson signaled that the department was rethinking the disparate impact doctrine last June. The new rule, a version of which was leaked to Politico, will be published in the Federal Register on Monday, triggering a 60-day comment period before it can be officially implemented.
Under the current rule, disparate impact cases proceed by meeting a three-part burden-shifting test: A plaintiff makes an allegation, a defendant offers a rebuttal, then the plaintiff responds. The new rule would set a five-point prima facie evidentiary test on the plaintiff side alone. This means that a party looking to bring a discrimination case under the Fair Housing Act would need to establish some level of evidence in the pleading stage. To bring forward an accusation of implicit discrimination, plaintiffs would need to demonstrate—before any discovery process—that the policy itself is flawed.
Under the five-point burden test, plaintiffs would need to 1) prove that a policy is “arbitrary, artificial, and unnecessary” to achieve a valid interest; 2) demonstrate a “robust causal link” between the practice and the disparate impact; 3) show that the policy negatively affects “members of a protected class” based on race, color, religion, sex, family status, or national origin; 4) indicate that the impact is “significant”; and 5) prove that the “complaining party’s alleged injury” is directly caused by the practice in question.
“This shifts so much of the responsibility to the plaintiff to make allegations nearly impossible, without having gone through a discovery process, of this tight causal link between the policy and the effect,” says Urban Institute senior fellow Solomon Greene.
In addition, the new HUD rule would establish three new defenses for landlords, lenders, and others accused of discrimination based on models and algorithms. The first defense would enable defendants to indicate that a model isn’t the cause of the harm. The second would allow the defendant to show that a model or algorithm is being used as intended, and is the responsibility of a third party. Finally, the new rule would allow the defendant to call on a qualified expert to show that the alleged harm isn’t a model’s fault.
Critics say that this new development gives lenders and landlords a big loophole. Many if not most financial institutions are not capable of developing their own in-house credit-risk algorithms; instead, they turn to third-party vendors. By putting the onus of fairness on these vendors, HUD is establishing a perverse incentive for banks and vendors alike to decline to study the outcomes of automated decision-making systems, according to Jacob Metcalf, a researcher for the nonprofit research institute Data & Society and founder of Ethical Resolve, a data-ethics consultancy.
“As long as the bank or lender is buying this tool from a third party that claims it has been adequately tested for algorithmic fairness, then the bank or lender is shielded from liability,” Metcalf says. “That’s a problem because there are no established standards—and the HUD rule doesn’t set out to establish any standards—about disparate impact.”
If a bank is not liable for a disparate impact created by an algorithm, then it will have no incentive to shop for a company with a model that abides by a proven standard. Any vendor that invests the time and labor in demonstrating that its model does not discriminate across a wide variety of housing markets does so at a comparative disadvantage. There’s an incentive for all involved to not know.
Meanwhile, a plaintiff has no way of knowing what data a vendor uses to model something like credit risk. Third parties would be able to shield their practices behind trade secrets; any plaintiff looking to suss out whether an algorithm has a discriminatory impact might wind up “lost in a web of vendor relationships,” Metcalf says, with little recourse, especially prior to the discovery stage.
This is the first federal regulation to directly address algorithms and disparate impact. Attorneys couldn’t point to any case law that addresses algorithmic models and disparate impact, either. It’s not a wholly unreasonable idea for a regulation, Metcalf says: Many banks don’t have the resources to gauge the liability of an algorithm. But without sufficient due-diligence standards, vendors will have every incentive to drag their feet. And as long as their models aren’t blatantly discriminatory, then the vendors likely wouldn’t be held responsible for disparate impacts, either.
If HUD instead required banks to run tests on the models they use, then vendors would produce platforms that provide those reports as a service. That would be a value-add for lenders, Metcalf says, since banks would pay more for a disparate-impact report that was reliable, easy to run, and helped to keep them in compliance.
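The kind of disparate-impact report described above can be surprisingly simple to run once a standard exists. As a purely illustrative sketch (not anything prescribed by HUD or the proposed rule), the snippet below applies one well-known benchmark, the “four-fifths” adverse-impact ratio from EEOC employment guidance, to hypothetical loan-approval decisions; the group labels and numbers are invented for the example.

```python
# Illustrative only: checks hypothetical approval decisions for disparate
# impact using the "four-fifths" rule from EEOC employment guidance.
# Nothing here reflects an actual HUD standard; data is invented.

def selection_rates(decisions):
    """decisions: list of (group, approved) pairs -> approval rate per group."""
    totals, approved = {}, {}
    for group, ok in decisions:
        totals[group] = totals.get(group, 0) + 1
        approved[group] = approved.get(group, 0) + (1 if ok else 0)
    return {g: approved[g] / totals[g] for g in totals}

def adverse_impact_ratio(decisions):
    """Ratio of the lowest group approval rate to the highest.
    Under the four-fifths rule, a ratio below 0.8 is treated as
    evidence of disparate impact."""
    rates = selection_rates(decisions)
    return min(rates.values()) / max(rates.values())

# Hypothetical data: group A approved 80 of 100, group B approved 50 of 100.
decisions = ([("A", True)] * 80 + [("A", False)] * 20
             + [("B", True)] * 50 + [("B", False)] * 50)
print(round(adverse_impact_ratio(decisions), 3))  # 0.5 / 0.8 = 0.625
```

A vendor report built around a test like this, run across the markets a model serves, is the sort of routine compliance artifact Metcalf suggests banks would pay for.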
But under the proposed rule, it falls on the plaintiff to determine, case by case, how an algorithm affects them by suing the company or companies responsible for making the algorithm—without any standard in place for algorithmic fairness.
“How do you build a model to avoid disparate impact?” Metcalf says. “How often should it be tested? When does it need to be retested? How do you know if it’s appropriate from one population to another? Maybe it’s fair for the population of Ann Arbor. Maybe it’s unfair for the population of Detroit. How do you know which population it was trained on?”
He adds, “If HUD isn’t going to answer those questions, it’s a get-out-of-jail-free card. They’re creating the liability loopholes that all of the potential plaintiffs will fall through by default.”
Civil rights organizations are already gearing up for a fight over the rule. The National Fair Housing Alliance, Leadership Conference on Civil and Human Rights, NAACP, and others are joining forces under the banner of Defend Civil Rights. This new coalition aims to oppose efforts by the Trump administration to dial back regulations that safeguard minorities from discrimination, according to a civil rights attorney familiar with the project who couldn’t speak on the record before the group’s launch on Monday.
Defend Civil Rights will not only advocate for protections in housing: The coalition also plans to address education, labor, healthcare, environmental justice, and other fronts. For example, Secretary of Education Betsy DeVos has proposed canceling an Obama-era push to eliminate racial disparities in school discipline.
“When it comes to policymaking, most institutions, whether they’re lending institutions or landlords, have long since abandoned explicit racial [or other] discrimination. Disparate impact is really the best tool we have to level the playing field,” Greene says.
Defenders of the administration’s efforts say that it’s necessary to bring the department’s regulations in line with the Supreme Court’s 2015 decision in Texas Department of Housing and Community Affairs v. The Inclusive Communities Project. Francis Riley, a partner at Saul Ewing Arnstein & Lehr who represents defendants in the civil rights arena, says that the decision will constrain claims from plaintiffs.
“It puts the courts front-and-center to control claims that move on to discovery,” Riley says. “If [plaintiffs] are using a defendant’s [Home Mortgage Disclosure Act] data, or regional HMDA data, that shows that a particular area is not being served by the defendant, that is not enough. They have to actually assert, and in more than just a perfunctory way, that the lender has a policy or practice that they are effectively enforcing which has the goal of discriminating against those individuals.”
Riley says that the new rule will still allow plaintiffs to pursue landlords and lenders who are guilty of unfair housing practices. Those who discriminate should be hauled into court to answer for wrongful discrimination, he says. But he says the rule will prevent the plaintiffs’ bar from bringing forward cases based on statistical data alone. While he thinks the new HUD regulation is a step in the right direction, he notes that it conflicts with other existing standards used by other federal agencies.
“We know what HUD’s doing,” Riley says. “What’s the [Consumer Financial Protection Bureau] going to do? What’s the [Federal Housing Administration] going to do? All of these departments have fair housing divisions.”
The language of the new regulation relies heavily on the text of former Justice Anthony Kennedy’s 5–4 decision for the majority in Inclusive Communities. The court ruled that disparate impact is “cognizable under the Fair Housing Act,” affirming prior decisions by eleven federal appellate courts that relied on this doctrine. Kennedy’s decision did not rely on the Obama-era HUD rule on disparate impact, which codified practices across the department. But the Trump administration saw the decision as a reason to revise the housing department’s rule.
In the Inclusive Communities decision, the court considered an ongoing challenge from the Dallas area. There, housing authorities had long been distributing housing tax credits, which are used to build low-income housing, in a way that consolidated construction in mostly black areas. This is a straightforward example of disparate impact, with none of the complications of machine learning or artificial intelligence that the new HUD rule anticipates.
On Friday, advocacy groups such as the National Low Income Housing Coalition and the National Community Reinvestment Coalition condemned the rule in strong terms. Civil rights attorneys worry that the new standard will unwind the protections afforded by the Fair Housing Act.
“Fundamentally, if this rule is adopted, and disparate impact is no longer available as a legal bulwark against facially neutral or unintentionally discriminatory policies,” Greene says, “we’re in a lot of trouble.”