The Off Topic Topic

Started by Korea, March 10, 2009, 06:24:26 AM

ulmont

Quote from: DGuller on September 28, 2021, 07:59:29 PM
Another big problem is that things are done very quickly and very sloppily, for the most part.  There isn't enough of an engineering culture to make sure that details are taken care of with due diligence, and there aren't enough skilled data scientists around to put it in practice.

...and definitely not enough female or black engineers to actually notice things that might disproportionately impact women or blacks, which I'm sure is where Oex would have ended up if you had been anything but a total dick in framing your "question."

DGuller

Quote from: ulmont on September 28, 2021, 10:39:20 PM
Quote from: DGuller on September 28, 2021, 07:59:29 PM
Another big problem is that things are done very quickly and very sloppily, for the most part.  There isn't enough of an engineering culture to make sure that details are taken care of with due diligence, and there aren't enough skilled data scientists around to put it in practice.

...and definitely not enough female or black engineers to actually notice things that might disproportionately impact women or blacks, which I'm sure is where Oex would have ended up if you had been anything but a total dick in framing your "question."
If that's where Oex would've ended up, then I'm glad he quit before going there.  I know casually and baselessly accusing people in STEM fields of bias, sorry, unconscious bias, is in vogue these days, but if you're going to argue that you should hire fewer white and Asian men for data science roles, you're going to need to come up with something less speculative than "white and Asian men are incapable of noticing disproportionate impact on women and blacks".

ulmont

#82548
Quote from: DGuller on September 28, 2021, 11:01:12 PM
if you're going to argue that you should hire fewer white and Asian men for data science roles, you're going to need to come up with something less speculative than "white and Asian men are incapable of noticing disproportionate impact on women and blacks".

Yes, yes, it's probably nothing other than coincidence that women and blacks notice systematic biases in favor of white people and men more often than white people and men do.

If only someone could explain this to us with data.

...failing data, possibly we could turn to the wisdom of the ancients.

Let's ask Cicero.  Cicero, what do you think?

Quote from: Cicero
cui bono?

Eddie Teach

It's also possible women and minorities are more conditioned to attribute adverse outcomes to systemic bias rather than bad luck.
To sleep, perchance to dream. But in that sleep of death, what dreams may come?

Syt

Quote from: Eddie Teach on September 28, 2021, 11:34:12 PM
It's also possible women and minorities are more conditioned to attribute adverse outcomes to systemic bias rather than bad luck.

:rolleyes:
I am, somehow, less interested in the weight and convolutions of Einstein's brain than in the near certainty that people of equal talent have lived and died in cotton fields and sweatshops.
—Stephen Jay Gould

Proud owner of 42 Zoupa Points.

Admiral Yi

I can't see how anyone benefits from a biased ad generation or search algorithm.

Eddie Teach

Case in point: you don't believe the system is biased against you, so the probability of you attributing an adverse outcome to systemic bias is 0ish.
To sleep, perchance to dream. But in that sleep of death, what dreams may come?

Josquius

#82553
Facial recognition is an interesting one with a lot of examples where lack of diversity has impacted things.
In part it's from the fundamentals of how we build our camera technology.

https://petapixel.com/2010/01/22/racist-camera-phenomenon-explained-almost/

I have definitely noticed this in my life: when I'm in a photo together with a very dark-skinned person, one of us will be poorly focused and hard to see.

One story I really like is how this has created a gap in the market.

https://www.cnn.com/2018/10/10/tech/tecno-phones-africa/index.html

The problem persists, alas.
https://www.wired.com/story/best-algorithms-struggle-recognize-black-faces-equally/


As well as just skin tone, there's also the "All Chinese people look the same" factor.
This is a very real phenomenon: if you don't have much experience with people of another race, you can have trouble telling individuals apart. The key is that people of different races tend to subconsciously look for different things when identifying others.
In Japan I had big trouble telling people apart at first (and always got mistaken for people I look nothing like. But hey ho). The thing is, whilst in the west we will look at eye and hair colour, nose size, etc., in Japan it's far more about face shape and other things we don't really consider.
It definitely seems feasible that a lack of people from certain ethnic groups making facial recognition algorithms would mean the algorithms don't properly cover what they look for.
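
To make the "disproportionate impact" point concrete, here's a minimal sketch of the kind of per-group audit that surfaces it. The group labels and match results below are entirely made up; a real audit would run the model's actual decisions over a benchmark labelled with demographic metadata.

```python
# Minimal sketch: compare a face-verification model's error rates by group.
# All data here is hypothetical, for illustration only.
from collections import defaultdict

# (group, ground_truth_same_person, model_said_same_person)
results = [
    ("group_a", True, True), ("group_a", True, True), ("group_a", True, False),
    ("group_a", False, False),
    ("group_b", True, False), ("group_b", True, False), ("group_b", True, True),
    ("group_b", False, True),
]

counts = defaultdict(lambda: {"fn": 0, "pos": 0, "fp": 0, "neg": 0})
for group, is_match, predicted_match in results:
    c = counts[group]
    if is_match:
        c["pos"] += 1
        c["fn"] += 0 if predicted_match else 1   # genuine pair rejected
    else:
        c["neg"] += 1
        c["fp"] += 1 if predicted_match else 0   # impostor pair accepted

for group, c in counts.items():
    fnr = c["fn"] / c["pos"] if c["pos"] else float("nan")
    fpr = c["fp"] / c["neg"] if c["neg"] else float("nan")
    print(f"{group}: false non-match rate {fnr:.2f}, false match rate {fpr:.2f}")
```

If the rates diverge sharply between groups, the model works noticeably worse for some people than for others, which is exactly the sort of thing the articles above report.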

Tamas

Quote from: Admiral Yi on September 28, 2021, 11:55:57 PM
I can't see how anyone benefits from a biased ad generation or search algorithm.

:huh: That (the ad generation at least) is the very core of how most online ads work these days. They monitor your browsing with the help of Google and Facebook, and show you ads which cater to your biases and interests.

Besides, I can't believe we are really arguing whether data scientists can be influenced by their own views and biases OR if they are superhuman instead.
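
The principle is roughly this (a toy sketch; the interest categories, ads, and weights are all made up): build an interest profile from browsing history, then pick whichever ad overlaps it most.

```python
# Toy sketch of interest-based ad selection: score each ad by how well its
# topic tags overlap a profile inferred from browsing. All values are made up.
browsing_profile = {"football": 0.9, "cars": 0.6, "cooking": 0.1}

ads = [
    {"name": "sports_streaming", "tags": {"football": 1.0}},
    {"name": "suv_lease",        "tags": {"cars": 0.8, "family": 0.2}},
    {"name": "knife_set",        "tags": {"cooking": 1.0}},
]

def score(ad):
    # Weighted overlap between the user's inferred interests and the ad's tags.
    return sum(browsing_profile.get(tag, 0.0) * w for tag, w in ad["tags"].items())

best = max(ads, key=score)
print(best["name"], round(score(best), 2))   # -> sports_streaming 0.9
```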

Admiral Yi

Tamas, bias can have different meanings.  I was using it in the sense of inaccurate.  In statistics bias means your sample does not reflect the underlying population.
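
A toy numerical illustration of that statistical sense of the word, with made-up numbers: the estimate goes wrong not through anyone's intent, but because the sample over-represents one subgroup of the population.

```python
# Toy illustration of sampling bias: estimating a population mean from a
# sample that over-represents one subgroup. Numbers are made up.
import random

random.seed(0)
# Population: half from subgroup A (values around 10), half from subgroup B (around 20).
population = [random.gauss(10, 2) for _ in range(5000)] + \
             [random.gauss(20, 2) for _ in range(5000)]
true_mean = sum(population) / len(population)

# Biased sample: 90% of it drawn from subgroup A only.
biased_sample = random.sample(population[:5000], 900) + random.sample(population[5000:], 100)
# Representative sample: drawn from the whole population.
fair_sample = random.sample(population, 1000)

print(f"true mean          ~ {true_mean:.1f}")                              # ~15
print(f"biased sample mean ~ {sum(biased_sample)/len(biased_sample):.1f}")  # ~11
print(f"fair sample mean   ~ {sum(fair_sample)/len(fair_sample):.1f}")      # ~15
```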

celedhring

The La Palma volcano is still going; tonight the lava finally reached the sea.


Josquius

I'd love to hear the sizzle.

DGuller

Quote from: Tyr on September 29, 2021, 02:40:26 AM
Facial recognition is an interesting one with a lot of examples where lack of diversity has impacted things.
In part it's from the fundamentals of how we build our camera technology.

https://petapixel.com/2010/01/22/racist-camera-phenomenon-explained-almost/
See, this is a classic example of what I was talking about.  You've described the problem, and then handwaved in "lack of diversity impacted things".  What's completely missing here are two crucial connections:  how exactly did lack of diversity among data scientists lead to this problem, and how would making data scientists more diverse solve this problem?

Josquius

Quote from: DGuller on September 29, 2021, 07:21:42 AM
Quote from: Tyr on September 29, 2021, 02:40:26 AM
Facial recognition is an interesting one with a lot of examples where lack of diversity has impacted things.
In part it's from the fundamentals of how we build our camera technology.

https://petapixel.com/2010/01/22/racist-camera-phenomenon-explained-almost/
See, this is a classic example of what I was talking about.  You've described the problem, and then handwaved in "lack of diversity impacted things".  What's completely missing here are two crucial connections:  how exactly did lack of diversity among data scientists lead to this problem, and how would making data scientists more diverse solve this problem?
Because it's basic logic.
If everyone on the team is pale, there's a natural bias towards considering pale people. Very likely they won't even realise there are potential problems with darker-skinned people.
Even if they do realise this problem exists, will they understand it enough to tackle it properly?
Sure, it's feasible a totally white team could really care and research rigorously and build a full understanding... but having a diverse team is a good shortcut to that. It really helps cover the unknown unknowns problem.
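
One concrete form that "covering the unknown unknowns" tends to take, once somebody thinks to ask the question, is a check like this sketch: does the data even include enough of each group to evaluate on? The group labels and threshold are made up for illustration.

```python
# Tiny sketch: flag groups that are under-represented in a dataset before
# training or evaluating on it. Group labels and threshold are made up.
from collections import Counter

samples = ["group_a"] * 9000 + ["group_b"] * 800 + ["group_c"] * 200

counts = Counter(samples)
total = sum(counts.values())
MIN_SHARE = 0.10   # arbitrary floor, for illustration only

for group, n in counts.items():
    share = n / total
    flag = "  <-- under-represented" if share < MIN_SHARE else ""
    print(f"{group}: {n} samples ({share:.1%}){flag}")
```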