
The Off Topic Topic

Started by Korea, March 10, 2009, 06:24:26 AM


DGuller

Quote from: Tyr on September 29, 2021, 07:42:10 AM
Because it's basic logic.
If everyone on the team is pale, there's a natural bias towards considering pale people. Very likely they won't even realise there are potential problems with darker people.
Even if they do realise this problem exists, will they understand it enough to tackle it properly?
Sure, it's feasible a totally white team could really care, research rigorously, and build a full understanding... but having a diverse team is a good shortcut to that. It really helps cover the unknown-unknowns problem.
Let's just focus on the article that I replied to.  Can you please go into more technical detail about the role lack of diversity played in the problem described in the article?  Maybe it's basic logic to you, but so far this looks like baseless speculation to me.  The article doesn't mention anything about diversity either; the connection is one you made, so I want to hear what your thought process is.  Start with what caused the problem in general, and then you can point out what role lack of diversity played in it.

Grey Fox

Quote from: DGuller on September 29, 2021, 07:21:42 AM
Quote from: Tyr on September 29, 2021, 02:40:26 AM
Facial recognition is an interesting one with a lot of examples where lack of diversity has impacted things.
In part it's from the fundamentals of how we build our camera technology.

https://petapixel.com/2010/01/22/racist-camera-phenomenon-explained-almost/
See, this is a classic example of what I was talking about.  You've described the problem, and then handwaved in "lack of diversity impacted things".  What's completely missing here are two crucial connections:  how exactly did lack of diversity among data scientists lead to this problem, and how would making data scientists more diverse solve this problem?

Because we take pictures of ourselves.

(I haven't read the article. I just make cameras for a living.)
Colonel Caliga is Awesome.

DGuller

Quote from: Grey Fox on September 29, 2021, 08:18:05 AM
Because we take pictures of ourselves.

(I haven't read the article. I just make cameras for a living.)
Can you expound on it?  What does us taking pictures of ourselves have to do with cameras mistakenly detecting blinking in some Asian faces?

Josquius

Quote from: DGuller on September 29, 2021, 08:16:25 AM
Quote from: Tyr on September 29, 2021, 07:42:10 AM
Because it's basic logic.
If everyone on the team is pale, there's a natural bias towards considering pale people. Very likely they won't even realise there are potential problems with darker people.
Even if they do realise this problem exists, will they understand it enough to tackle it properly?
Sure, it's feasible a totally white team could really care, research rigorously, and build a full understanding... but having a diverse team is a good shortcut to that. It really helps cover the unknown-unknowns problem.
Let's just focus on the article that I replied to.  Can you please go into more technical detail about the role lack of diversity played in the problem described in the article?  Maybe it's basic logic to you, but so far this looks like baseless speculation to me.  The article doesn't mention anything about diversity either; the connection is one you made, so I want to hear what your thought process is.  Start with what caused the problem in general, and then you can point out what role lack of diversity played in it.

I note you quoted just one article of the bunch I posted there, the one that deals with a different issue from the main one being discussed.

The blinking Asian thing is curious, as it comes from Asian companies who you'd think would be used to this. A quick Google doesn't give a definitive answer, but I'd guess this came from the American wing of the company, building a product for the American market and not bothering to properly test on a diverse audience. Perhaps they even over-corrected pre-existing Asian software for the differences in white faces?

crazy canuck

Quote from: DGuller on September 28, 2021, 07:42:47 PM
Quote from: Oexmelin on September 28, 2021, 07:23:50 PM
With that tone, I am not sure I am terribly inclined to play your little game.
Don't worry, I was well aware that you had nothing to offer even if you were inclined.  I'm glad I saved you the effort of writing four content-free paragraphs, and that I saved myself the effort of having to read them.

You don't understand the concept Oex was attempting to explain to you?  Well, not entirely surprising, but don't be a fucking asshole on top of your ignorance.

HVC

Bomb scare near me yesterday. Cops blew it up around 7:30.
Being lazy is bad; unless you still get what you want, then it's called "patience".
Hubris must be punished. Severely.

Grey Fox

Quote from: DGuller on September 29, 2021, 08:22:22 AM
Quote from: Grey Fox on September 29, 2021, 08:18:05 AM
Because we take pictures of ourselves.

(I haven't read the article. I just make cameras for a living.)
Can you expound on it?  What does us taking pictures of ourselves have to do with cameras mistakenly detecting blinking in some Asian faces?

There are, for the purposes of simplification, two main facets to creating a digital image.

1) Image processing
2) Sensor light interpretation. Photons are real!

1) Image processing is, nowadays, two things: AI and algorithms.

Image processing AIs are mainly created by feeding them pictures. A lot of the easily available images online are of white people, so you have to make an effort to use images of black people. I've never had to go looking for images of Asian people; it must be harder when you need to avoid China at all costs.

Image processing algorithms suffer from the same thing that sensors do, and while that can be mitigated, the problem is created, as Tyr mentions, by developer testing laziness. If the only people around are white, that's what it gets tested on.
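As a rough illustration, even a trivial audit of an image library's annotations would surface that problem early. This is only a sketch; the group labels and numbers are made up, not from any real product:

```python
from collections import Counter

# Toy audit of a training/test library's demographic mix.
# `annotations` stands in for hypothetical per-image group labels.
annotations = ["white", "white", "white", "white", "asian", "black"]

counts = Counter(annotations)
total = sum(counts.values())
for group, n in counts.most_common():
    print(f"{group}: {n / total:.0%}")
# A lopsided mix here is an early warning of lopsided error rates later.
```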

2) Sensors.

Modern digital sensors interpret light by filling up physical wells with photons: more photons means more light. We usually use an 8-bit scale, 0 to 255, to measure how much light we have acquired. Smooth surfaces tend to reflect more light, especially as we get closer to saturation (255), and there is not much difference between 230 and 255 to the human eye. Asian facial structure is, I think, usually smoother. That probably throws off all sorts of inflection points where we see weird artifacts. A lighting solution appropriate to the application should mitigate that problem. So can a software solution, but that needs testing.
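A minimal sketch of that saturation squeeze, with a made-up full-well capacity and made-up photon counts:

```python
import numpy as np

# Toy model of an 8-bit sensor: photon counts fill a well that clips
# at the full-well capacity, then get quantized to 0-255.
FULL_WELL = 10_000  # hypothetical full-well capacity, in photons

def to_8bit(photons):
    clipped = np.minimum(photons, FULL_WELL)    # the well saturates
    return np.round(clipped / FULL_WELL * 255)  # quantize to 0-255

# Two nearby highlights on a smooth, reflective surface:
print(to_8bit(np.array([9_200, 9_900])))    # -> [235. 252.]
# A 700-photon gap survives as only ~17 of 255 levels, and anything
# past the well capacity collapses to exactly 255:
print(to_8bit(np.array([10_000, 14_000])))  # -> [255. 255.]
```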
Colonel Caliga is Awesome.

DGuller

Quote from: Tyr on September 29, 2021, 08:40:01 AM
I note you quoted just one article of the bunch I posted there, the one that deals with a different issue from the main one being discussed.
Did I quote it out of context?  I thought the part that I quoted was self-contained, and I focused on it for the sake of brevity.
Quote
The blinking Asian thing is curious, as it comes from Asian companies who you'd think would be used to this. A quick Google doesn't give a definitive answer, but I'd guess this came from the American wing of the company, building a product for the American market and not bothering to properly test on a diverse audience. Perhaps they even over-corrected pre-existing Asian software for the differences in white faces?
Again, this is all speculation.  I don't expect you to know the technical reasons for this happening, but I'm just noting how people are conditioned to make such leaps without understanding the issues on even a basic level.  The article you linked made no connection to diversity (not that you can't find plenty of garbage articles that do), you have no special expertise or technical knowledge to make such a connection yourself, and yet you readily jump to that conclusion.

I don't mean to single you out; you were the only one here to put yourself out there and present arguments, while being far from the only one to make such leaps. I just want to point out how natural it is to blame "lack of diversity" without having any understanding of how it affects anything. A lot of nonsense thinking becomes ingrained as common sense when you start with a conclusion and go from there. Another ironic thing is that about 20% of American data scientists are Asian, a vastly greater percentage than their share of the US population, and yet their presence didn't seem to prevent anti-Asian racism by cameras.

DGuller

Quote from: Grey Fox on September 29, 2021, 09:04:47 AM
If the only people around are white, that's what it gets tested on.
Are you assuming that the way AI algorithms are tested is that all the data scientists in the office get together, snap pictures of themselves, and evaluate the results?  That seems like what you're implying.

Grey Fox

Quote from: DGuller on September 29, 2021, 09:08:17 AM
Quote from: Grey Fox on September 29, 2021, 09:04:47 AM
If the only people around are white, that's what it gets tested on.
Are you assuming that the way AI algorithms are tested is that all the data scientists in the office get together, snap pictures of themselves, and evaluate the results?  That seems like what you're implying.

Yes. When your goal is to identify people.

Actually, creating an image processing AI (not all AIs do image processing, or are equal!) is an extremely time-consuming process, so we take already-created AI bases and further polish them to make them usable for a real-world application.  Some of those AI bases are relatively old and white-biased. Image processing AIs are not jack-of-all-trades things.
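For what it's worth, that polishing step usually looks something like the transfer-learning pattern below. This is only a sketch using torchvision, not necessarily what any given camera team actually uses, and the two-class blink/no-blink head is hypothetical:

```python
import torch
import torchvision.models as models

# Start from a pretrained backbone; its weights carry whatever biases
# its original training images had.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the inherited base and train only a new task head.
for p in backbone.parameters():
    p.requires_grad = False
backbone.fc = torch.nn.Linear(backbone.fc.in_features, 2)  # blink / no blink
# ...then train backbone.fc on the application's own labeled images.
```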
Colonel Caliga is Awesome.

DGuller

Quote from: Grey Fox on September 29, 2021, 09:12:18 AM
Quote from: DGuller on September 29, 2021, 09:08:17 AM
Quote from: Grey Fox on September 29, 2021, 09:04:47 AM
If the only people around are white, that's what it gets tested on.
Are you assuming that the way AI algorithms are tested is that all the data scientists in the office get together, snap pictures of themselves, and evaluate the results?  That seems like what you're implying.

Yes. When your goal is to identify people.
I don't know for a fact how every company tests their AI algorithms, but if that's how they test them, then they have vastly bigger problems than lack of diversity.  Since that would be an unfathomable level of incompetence, though, I'm not going to assume that this is indeed how they test.

Grey Fox

What did you expect?

Colonel Caliga is Awesome.

ulmont

Quote from: DGuller on September 29, 2021, 09:06:15 AM
people are conditioned to make such leaps without understanding the issues on even a basic level.

But only the non-STEM-trained people are culturally conditioned, of course.  STEMlords are above such things.

DGuller

Quote from: Grey Fox on September 29, 2021, 09:18:00 AM
What did you expect?
At the bare minimum, I expected cross-validation or holdout testing on a statistically significant and representative library of labeled images.
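Roughly the pattern I mean, on synthetic stand-in data. A sketch with scikit-learn, where the features, labels, and group annotations are all placeholders for a real labeled image library:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-ins: random "image features", derived "blink" labels,
# and a group annotation so the holdout split stays representative.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 16))
y = (X[:, 0] > 0).astype(int)
group = rng.choice(["A", "B"], size=1000, p=[0.8, 0.2])

# Stratify on the group annotation so both groups keep their share
# in the held-out test set the model never sees during training.
X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(
    X, y, group, test_size=0.2, stratify=group, random_state=0)

model = LogisticRegression().fit(X_tr, y_tr)
for g in ["A", "B"]:
    mask = g_te == g
    print(g, model.score(X_te[mask], y_te[mask]))  # per-group held-out accuracy
```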

Grey Fox

Quote from: DGuller on September 29, 2021, 09:20:36 AM
Quote from: Grey Fox on September 29, 2021, 09:18:00 AM
What did you expect?
At the bare minimum, I expected cross-validation or holdout testing on a statistically significant and representative library of labeled images.

Man, that would be awesome.

Colonel Caliga is Awesome.