Quote from: Admiral Yi on October 08, 2025, 11:21:24 PM
OK, that sounds like a different argument than the one you were making previously.

It's the same damn argument I made in the very first post.
Quote from: DGuller on October 07, 2025, 07:34:41 AM
The issue where tenants with an eviction history can't find an apartment is also part of a much bigger problem not tied to homelessness. It's one of the older AI problems I was always concerned about, where many seemingly independent actors all buy a vendor solution, and in effect become a monopolist even without intending to do so.
At the individual level, it makes sense to screen out tenants with a bad "tenant score", whatever the reasons happen to be. At the collective level, though, people who fall on the wrong side of the model, whether for good reasons or because they're an unlucky residual, get frozen out of the market entirely. Previously you might have found some luck with some landlord, but if all of them now use the same or similar solutions to screen out potential nightmare tenants, then those potential nightmare tenants are fucked.
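To make the point concrete, here's a minimal toy simulation (all numbers and names are made up for illustration): when every landlord independently forms their own noisy judgment, a borderline tenant gets accepted somewhere; when they all consume the identical vendor score, a tenant just over the threshold is rejected everywhere at once.

```python
import random

random.seed(0)

N_LANDLORDS = 20
THRESHOLD = 0.5  # hypothetical risk cutoff; reject if score >= threshold

def shared_score(tenant_risk):
    # One vendor model: every landlord sees the exact same score.
    return tenant_risk

def independent_score(tenant_risk):
    # Each landlord's own noisy read of the same tenant.
    return tenant_risk + random.gauss(0, 0.15)

def acceptance_count(score_fn, tenant_risk):
    # How many of the landlords would accept this tenant?
    return sum(score_fn(tenant_risk) < THRESHOLD for _ in range(N_LANDLORDS))

borderline = 0.55  # looks slightly risky on paper
print("shared vendor model:", acceptance_count(shared_score, borderline), "of", N_LANDLORDS)
print("independent screens:", acceptance_count(independent_score, borderline), "of", N_LANDLORDS)
```

Under the shared model the borderline tenant is accepted by zero landlords; under independent noisy screens, some landlords still take a chance. The noise isn't landlord irrationality here, it stands in for the diversity of judgment that a single shared model eliminates.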
In car insurance, we have insurers of last resort and risk pools, because we understand that even justifiably bad risks often need a car to function in US society. That's why there are various schemes to force insurers to deal with them, even if all the good models, as well as common sense, tell us they're a loss. I think it's way past time the same concept was extended to housing, or to access to financial services.
Quote
Out of curiosity, what would be an example of a bad tenant that was unfairly rated?

The word "fair" is actually very loaded. Even people acting in good faith can view it differently, and there are a lot of people right now hostile to AI who are using it in very bad faith. I will avoid using this word here.
Quote from: Admiral Yi on October 08, 2025, 10:18:48 PM
Yes, the one common thread is that information is disseminated. The bad tenant and the bad driver and the welcher might prefer that information not be disseminated, but everyone on the other side of the deal is happy to have it. There's nothing unfair or unjust about the legal acquisition of information and acting on it.

I'll go back to one of my earlier points: an individual may just be an unlucky residual of a statistical model. What you call "information" may be a lot less deterministic than you make it sound. Models are directionally right on average if they're competently built, but they can still be systematically wrong at the individual level. A car insurance company doesn't observe you every second you're behind the wheel; it estimates whether you're a "good driver" or a "bad driver" based on some criteria, which are valid but incomplete.
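The "unlucky residual" idea can be sketched in a few lines. Assume a hypothetical driver score built on one coarse observable feature (a mileage bucket), while the trait that actually drives risk is unobserved; all names and rates below are invented for illustration:

```python
# Hypothetical drivers: (name, observable mileage bucket, true accident rate).
# The insurer can only see the bucket, not the true rate.
drivers = [
    ("avg_high_miler_1", "high", 0.12),
    ("avg_high_miler_2", "high", 0.11),
    ("careful_high_miler", "high", 0.02),  # the unlucky residual
    ("avg_low_miler_1", "low", 0.04),
    ("avg_low_miler_2", "low", 0.05),
]

def bucket_average(bucket):
    # The best the model can do: predict the group average for the bucket.
    rates = [r for _, b, r in drivers if b == bucket]
    return sum(rates) / len(rates)

for name, bucket, true_rate in drivers:
    predicted = bucket_average(bucket)
    print(f"{name}: predicted {predicted:.3f}, actual {true_rate:.3f}")
```

The model is exactly right on average within each bucket, so on aggregate it looks competently built, yet the careful high-miler is predicted at roughly four times their actual risk. That gap is the residual, and the person carrying it has no way to signal it through the observed criteria.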
Quote from: Jacob on October 08, 2025, 08:31:00 PM
How likely are RN and LFI respectively to undermine democracy? Where do they stand on Europe? And how likely are they to withdraw support for Ukraine?
Quote from: DGuller on October 08, 2025, 08:09:31 PM
Let's focus on the common thread rather than on the fact that the natural imperfections of analogies can compound. What's common to all the analogies is that the decision of one market actor, when communicated to other actors, can create a coordination that freezes someone out. One landlord evicting a tenant gets communicated through background checks and denies the tenant even the opportunity to get another place. One gambler refuses to deal with another and badmouths them to other gamblers so that they also refuse to deal with them. One employer decides to fire an employee and, through references, makes them appear to be a bad hire to other employers.
The justification for spreading the information is secondary; what matters is the resulting coordination that turns private discretion into systemic exclusion. Once that happens, the market stops being free in any meaningful sense.